Semiconductor manufacturing process
A modern semiconductor manufacturing process is under constant surveillance via signals collected from sensors and process measurement points. However, not all of these signals are equally valuable in a specific monitoring system: the measurements contain a mix of useful information, irrelevant information, and noise, and engineers typically record far more signals than are actually required. If we treat each signal as a feature, feature selection can be applied to identify the most relevant ones. Process engineers can then use these signals to determine the key factors contributing to yield excursions downstream in the process, enabling increased throughput, faster learning cycles, and lower per-unit production costs. The selected signals serve as features for predicting the yield type, and by analysing different combinations of features, the essential signals affecting yield can be identified.
The data consists of 1567 examples, each with 591 features. Each example represents a single production entity with its associated measured features, and the label is a simple pass/fail yield from in-house line testing: in the target column, "-1" corresponds to a pass and "1" corresponds to a fail, and the timestamp identifies that specific test point.
We will build a classifier to predict the pass/fail yield of a production entity and analyse whether all of the features are actually required to build the model.
#loading required libraries
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from sklearn.preprocessing import binarize
from sklearn.metrics import confusion_matrix
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from scipy.stats import zscore
from sklearn.svm import SVC
from sklearn.metrics import classification_report, accuracy_score, f1_score
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import precision_recall_fscore_support
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier
from sklearn.ensemble import AdaBoostClassifier
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import StratifiedKFold,cross_val_score,KFold
from imblearn.over_sampling import SMOTE,ADASYN
from imblearn.under_sampling import RandomUnderSampler
from imblearn.over_sampling import RandomOverSampler
from scipy.stats import randint as sp_randint
from scipy.stats import uniform as sp_uniform
from sklearn.model_selection import RandomizedSearchCV,GridSearchCV
from sklearn import metrics
import seaborn as sns
sns.set(color_codes=True)
%matplotlib inline
import warnings
warnings.filterwarnings('ignore')
pd.set_option('display.max_columns', None)
pd.set_option('display.max_rows', None)
#import the full data set along with validation set
com=pd.read_csv('./signal-data+Future.csv')
com=com.drop(['Time'],axis=1)
row, column = com.shape
print('The complete dataset contains', row, 'rows and', column, 'columns')
The complete dataset contains 1585 rows and 591 columns
sg=com.iloc[0:1567,:]
row,column=sg.shape
print('The past dataset contains', row, 'rows and', column, 'columns')
The past dataset contains 1567 rows and 591 columns
val=com.iloc[1567:,:]
row,column=val.shape
print('The validation dataset contains', row, 'rows and', column, 'columns')
The validation dataset contains 18 rows and 591 columns
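Before modelling it is worth checking the class balance of the target, since the resampling imports above (SMOTE, ADASYN, random over/under-sampling) suggest a heavily imbalanced pass/fail split. A minimal sketch of that check, built on a small synthetic frame (with an assumed 14:2 split) so it runs standalone:

```python
import pandas as pd

# Hypothetical mini-version of the `sg` frame: 14 passes (-1), 2 fails (1).
sg = pd.DataFrame({'Pass/Fail': [-1] * 14 + [1] * 2})

# Count each class and compute the fail fraction.
counts = sg['Pass/Fail'].value_counts()
fail_fraction = counts.get(1, 0) / len(sg)
print(counts)
print('Fail fraction:', fail_fraction)  # 0.125 for this toy split
```

On the real data the same two lines, applied to the `sg` DataFrame loaded above, reveal whether a resampling strategy is needed before fitting the classifiers.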
#counting the number of missing values in each column
com.isnull().sum()
The per-column output (truncated here for readability) shows that most features are missing only a handful of values, but a few are heavily incomplete: features 85, 220, 358 and 492 each have 1359 missing values; 157/158 and 292/293 have 1447; 109-111, 244-246, 382-384 and 516-518 have 1036; 578-581 have 954; 72/73 and 345/346 have 804; 112, 247, 385 and 519 have 722; 546-557 have 261 and 562-569 have 280. The target column Pass/Fail has no missing values.
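Given columns with such different missing fractions, a common treatment (a sketch, not necessarily the author's exact approach) is to drop columns whose missing fraction exceeds a threshold and median-impute the rest. Illustrated on a tiny synthetic frame; the 0.5 threshold is an assumption to tune on the real data:

```python
import numpy as np
import pandas as pd

# Toy frame: 'a' is 25% missing, 'b' is 75% missing, 'c' is complete.
df = pd.DataFrame({'a': [1.0, np.nan, 3.0, 4.0],
                   'b': [np.nan, np.nan, np.nan, 1.0],
                   'c': [1.0, 2.0, 3.0, 4.0]})

threshold = 0.5  # assumed cut-off on the missing fraction
keep = df.columns[df.isnull().mean() <= threshold]

# Keep the surviving columns and fill remaining gaps with the column median.
clean = df[keep].fillna(df[keep].median())
print(list(keep))  # 'b' exceeds the threshold and is dropped
```

Applied to `com`, this would remove the near-empty features (e.g. those with 1359 or 1447 missing values out of 1585 rows) before any imputation.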
#5-point summary
com.describe().T
| | count | mean | std | min | 25% | 50% | 75% | max |
|---|---|---|---|---|---|---|---|---|
| 0 | 1579.0 | 3014.292508 | 73.396769 | 2743.2400 | 2966.250000 | 3011.49000 | 3056.495000 | 3356.3500 |
| 1 | 1578.0 | 2496.196984 | 80.345945 | 2158.7500 | 2452.702500 | 2499.72000 | 2539.467500 | 2846.4400 |
| 2 | 1571.0 | 2200.800818 | 29.585955 | 2060.6600 | 2181.155500 | 2201.06670 | 2218.577800 | 2315.2667 |
| 3 | 1571.0 | 1394.127145 | 440.454615 | 0.0000 | 1081.685250 | 1283.43680 | 1590.214800 | 3715.0417 |
| 4 | 1571.0 | 4.162776 | 56.032462 | 0.6815 | 1.017700 | 1.31680 | 1.522250 | 1114.5366 |
| 5 | 1571.0 | 100.000000 | 0.000000 | 100.0000 | 100.000000 | 100.00000 | 100.000000 | 100.0000 |
| 6 | 1571.0 | 101.134096 | 6.214535 | 82.1311 | 97.937800 | 101.56670 | 104.586700 | 129.2522 |
| 7 | 1576.0 | 0.121824 | 0.008913 | 0.0000 | 0.121100 | 0.12240 | 0.123800 | 0.1286 |
| 8 | 1583.0 | 1.463381 | 0.073750 | 1.1910 | 1.411500 | 1.46210 | 1.517150 | 1.6564 |
| 9 | 1583.0 | -0.000791 | 0.015145 | -0.0534 | -0.010800 | -0.00120 | 0.008550 | 0.0749 |
| 10 | 1583.0 | 0.000126 | 0.009297 | -0.0349 | -0.005600 | 0.00040 | 0.005900 | 0.0530 |
| 11 | 1583.0 | 0.964249 | 0.012450 | 0.6554 | 0.958000 | 0.96570 | 0.971200 | 0.9848 |
| 12 | 1583.0 | 199.976048 | 3.320301 | 182.0940 | 198.130950 | 199.56690 | 202.024400 | 272.0451 |
| 13 | 1582.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 14 | 1582.0 | 9.021670 | 2.790252 | 2.2493 | 7.126475 | 8.99955 | 10.870125 | 19.5465 |
| 15 | 1582.0 | 413.134700 | 17.137575 | 333.4486 | 406.210800 | 412.33455 | 419.109275 | 824.9271 |
| 16 | 1582.0 | 9.905677 | 2.390549 | 4.4696 | 9.565375 | 9.85005 | 10.127550 | 102.8677 |
| 17 | 1582.0 | 0.971403 | 0.012009 | 0.5794 | 0.968100 | 0.97240 | 0.976800 | 0.9848 |
| 18 | 1582.0 | 190.068551 | 2.856460 | 169.1774 | 188.300275 | 189.68745 | 192.207450 | 215.5977 |
| 19 | 1575.0 | 12.479976 | 0.221326 | 9.8773 | 12.460000 | 12.49960 | 12.547100 | 12.9898 |
| 20 | 1585.0 | 1.405053 | 0.016702 | 1.1797 | 1.396500 | 1.40600 | 1.415000 | 1.4534 |
| 21 | 1583.0 | -5611.568225 | 642.023494 | -7150.2500 | -5932.625000 | -5522.00000 | -5356.625000 | 0.0000 |
| 22 | 1583.0 | 2696.785060 | 302.978380 | 0.0000 | 2578.000000 | 2664.00000 | 2841.875000 | 3656.2500 |
| 23 | 1583.0 | -3802.598442 | 1375.910580 | -9986.7500 | -4366.000000 | -3821.25000 | -3356.375000 | 2363.0000 |
| 24 | 1583.0 | -298.055801 | 2888.544140 | -14804.5000 | -1481.500000 | -73.25000 | 1365.125000 | 14106.0000 |
| 25 | 1583.0 | 1.201064 | 0.181933 | 0.0000 | 1.087400 | 1.28300 | 1.304300 | 1.3828 |
| 26 | 1583.0 | 1.935395 | 0.196692 | 0.0000 | 1.905000 | 1.98630 | 2.003200 | 2.0528 |
| 27 | 1583.0 | 6.613286 | 1.286365 | 0.0000 | 5.240000 | 7.26430 | 7.329600 | 7.6588 |
| 28 | 1583.0 | 69.440816 | 3.496311 | 59.4000 | 67.300000 | 69.11110 | 72.233300 | 77.9000 |
| 29 | 1583.0 | 2.366385 | 0.408286 | 0.6667 | 2.088900 | 2.37780 | 2.655600 | 3.5111 |
| 30 | 1583.0 | 0.184302 | 0.032937 | 0.0341 | 0.162000 | 0.18680 | 0.207150 | 0.2851 |
| 31 | 1583.0 | 3.671097 | 0.532697 | 2.0698 | 3.362950 | 3.43190 | 3.530950 | 4.8044 |
| 32 | 1584.0 | 85.333982 | 2.017667 | 83.1829 | 84.490500 | 85.12370 | 85.741900 | 105.6038 |
| 33 | 1584.0 | 8.961464 | 1.337546 | 7.6032 | 8.580100 | 8.77000 | 9.063900 | 23.3453 |
| 34 | 1584.0 | 50.580238 | 1.176247 | 49.8348 | 50.251200 | 50.39640 | 50.576700 | 59.7711 |
| 35 | 1584.0 | 64.551170 | 2.560479 | 63.6774 | 64.024800 | 64.16580 | 64.341850 | 94.2641 |
| 36 | 1584.0 | 49.419771 | 1.176248 | 40.2289 | 49.423300 | 49.60360 | 49.748800 | 50.1652 |
| 37 | 1584.0 | 66.217116 | 0.310898 | 64.9193 | 66.039800 | 66.23140 | 66.342600 | 67.9586 |
| 38 | 1584.0 | 86.835483 | 0.455026 | 84.7327 | 86.578300 | 86.82015 | 87.002400 | 88.4188 |
| 39 | 1584.0 | 118.674740 | 1.806683 | 111.7128 | 118.012400 | 118.39660 | 118.939600 | 133.3898 |
| 40 | 1561.0 | 67.881976 | 24.014503 | 1.4340 | 74.570000 | 78.29000 | 80.200000 | 86.1200 |
| 41 | 1561.0 | 3.351617 | 2.350617 | -0.0759 | 2.690000 | 3.07400 | 3.527000 | 37.8800 |
| 42 | 1584.0 | 70.000000 | 0.000000 | 70.0000 | 70.000000 | 70.00000 | 70.000000 | 70.0000 |
| 43 | 1584.0 | 355.568909 | 6.227249 | 342.7545 | 350.817950 | 353.74090 | 360.781600 | 377.2973 |
| 44 | 1584.0 | 10.030348 | 0.175470 | 9.4640 | 9.923500 | 10.03375 | 10.152425 | 11.0530 |
| 45 | 1584.0 | 136.776345 | 7.900727 | 108.8464 | 130.763225 | 136.44410 | 142.115675 | 176.3136 |
| 46 | 1584.0 | 733.750462 | 12.244789 | 699.8139 | 724.445700 | 733.53320 | 741.512700 | 789.7523 |
| 47 | 1584.0 | 1.177540 | 0.189265 | 0.4967 | 0.984850 | 1.24965 | 1.339050 | 1.5111 |
| 48 | 1584.0 | 139.965615 | 4.509777 | 125.7982 | 136.947275 | 139.98815 | 143.187525 | 163.2509 |
| 49 | 1584.0 | 1.000000 | 0.000000 | 1.0000 | 1.000000 | 1.00000 | 1.000000 | 1.0000 |
| 50 | 1584.0 | 632.310871 | 8.684547 | 607.3927 | 625.987750 | 631.38590 | 638.273425 | 667.7418 |
| 51 | 1584.0 | 157.793549 | 60.857960 | 40.2614 | 115.665800 | 183.57495 | 207.274550 | 258.5432 |
| 52 | 1584.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 53 | 1581.0 | 4.592948 | 0.054945 | 3.7060 | 4.574000 | 4.59600 | 4.617000 | 4.7640 |
| 54 | 1581.0 | 4.838700 | 0.059682 | 3.9320 | 4.816000 | 4.84300 | 4.869000 | 5.0110 |
| 55 | 1581.0 | 2856.388362 | 25.885453 | 2801.0000 | 2836.000000 | 2854.00000 | 2874.000000 | 2936.0000 |
| 56 | 1581.0 | 0.928828 | 0.006809 | 0.8755 | 0.925200 | 0.93100 | 0.933100 | 0.9378 |
| 57 | 1581.0 | 0.949176 | 0.004189 | 0.9319 | 0.946600 | 0.94930 | 0.952000 | 0.9598 |
| 58 | 1581.0 | 4.593531 | 0.084805 | 4.2199 | 4.531900 | 4.57290 | 4.668600 | 4.8475 |
| 59 | 1578.0 | 3.084001 | 9.673143 | -28.9882 | -1.857725 | 0.96635 | 4.419550 | 168.1455 |
| 60 | 1579.0 | 355.196250 | 6.029050 | 324.7145 | 350.635500 | 353.84730 | 359.743150 | 373.8664 |
| 61 | 1579.0 | 10.419764 | 0.277034 | 9.4611 | 10.280250 | 10.43430 | 10.590500 | 11.7849 |
| 62 | 1579.0 | 116.556562 | 8.650785 | 81.4900 | 112.066800 | 116.21550 | 120.959550 | 287.1509 |
| 63 | 1578.0 | 13.956496 | 7.099075 | 1.6591 | 10.329200 | 13.21365 | 16.317950 | 188.0923 |
| 64 | 1578.0 | 20.517815 | 4.993383 | 6.4482 | 17.333875 | 20.01185 | 22.802525 | 48.9882 |
| 65 | 1578.0 | 27.113969 | 7.124581 | 4.3080 | 23.046825 | 26.25575 | 29.908825 | 118.0836 |
| 66 | 1579.0 | 706.762801 | 11.684098 | 632.4226 | 698.783000 | 706.47610 | 714.646900 | 770.6084 |
| 67 | 1579.0 | 16.535837 | 305.748026 | 0.4137 | 0.891500 | 0.97670 | 1.064850 | 7272.8283 |
| 68 | 1579.0 | 147.439537 | 4.231705 | 87.0255 | 145.222700 | 147.59640 | 149.963600 | 167.8309 |
| 69 | 1579.0 | 1.000000 | 0.000000 | 1.0000 | 1.000000 | 1.00000 | 1.000000 | 1.0000 |
| 70 | 1579.0 | 619.195005 | 9.587303 | 581.7773 | 612.793650 | 619.08000 | 625.294500 | 722.6018 |
| 71 | 1579.0 | 104.084796 | 31.603902 | 21.4332 | 87.049250 | 102.55320 | 115.265750 | 238.4775 |
| 72 | 781.0 | 150.429675 | 18.326168 | -59.4777 | 145.326300 | 152.37080 | 158.554600 | 175.4132 |
| 73 | 781.0 | 468.034924 | 17.551239 | 456.0447 | 464.462800 | 466.09590 | 467.922000 | 692.4256 |
| 74 | 1579.0 | 0.002657 | 0.105583 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 4.1955 |
| 75 | 1561.0 | -0.006944 | 0.022271 | -0.1049 | -0.019500 | -0.00630 | 0.007100 | 0.2315 |
| 76 | 1561.0 | -0.029532 | 0.033149 | -0.1862 | -0.052200 | -0.02900 | -0.006800 | 0.0723 |
| 77 | 1561.0 | -0.007102 | 0.031326 | -0.1046 | -0.029500 | -0.00960 | 0.009200 | 0.1331 |
| 78 | 1561.0 | -0.013916 | 0.048439 | -0.3482 | -0.047500 | -0.01280 | 0.012100 | 0.2492 |
| 79 | 1561.0 | 0.003476 | 0.023186 | -0.0568 | -0.010800 | 0.00060 | 0.013200 | 0.1013 |
| 80 | 1561.0 | -0.018753 | 0.049366 | -0.1437 | -0.045600 | -0.00890 | 0.009000 | 0.1186 |
| 81 | 1561.0 | -0.021181 | 0.017037 | -0.0982 | -0.027200 | -0.01960 | -0.012000 | 0.0584 |
| 82 | 1561.0 | 0.006116 | 0.036022 | -0.2129 | -0.017700 | 0.00770 | 0.027000 | 0.1437 |
| 83 | 1584.0 | 7.450433 | 0.516830 | 5.8257 | 7.104000 | 7.46845 | 7.805125 | 8.9904 |
| 84 | 1573.0 | 0.133130 | 0.005051 | 0.1174 | 0.129800 | 0.13310 | 0.136300 | 0.1505 |
| 85 | 226.0 | 0.112783 | 0.002928 | 0.1053 | 0.110725 | 0.11355 | 0.114900 | 0.1184 |
| 86 | 1585.0 | 2.401878 | 0.037482 | 2.2425 | 2.376600 | 2.40390 | 2.428700 | 2.5555 |
| 87 | 1585.0 | 0.982407 | 0.012819 | 0.7749 | 0.975000 | 0.98740 | 0.989700 | 0.9935 |
| 88 | 1585.0 | 1807.423799 | 53.920922 | 1627.4714 | 1776.880300 | 1809.02160 | 1841.820600 | 2105.1823 |
| 89 | 1534.0 | 0.190303 | 0.068729 | 0.1113 | 0.169400 | 0.19010 | 0.200500 | 1.4727 |
| 90 | 1534.0 | 8826.216152 | 396.739803 | 7397.3100 | 8564.350125 | 8825.40000 | 9065.457400 | 10746.6000 |
| 91 | 1579.0 | 0.002097 | 0.088061 | -0.3570 | -0.043450 | 0.00000 | 0.050650 | 0.3627 |
| 92 | 1583.0 | 0.000489 | 0.003236 | -0.0126 | -0.001200 | 0.00040 | 0.002000 | 0.0281 |
| 93 | 1583.0 | -0.000544 | 0.002996 | -0.0171 | -0.001600 | -0.00020 | 0.001000 | 0.0133 |
| 94 | 1579.0 | -0.000030 | 0.000174 | -0.0020 | -0.000100 | 0.00000 | 0.000100 | 0.0011 |
| 95 | 1579.0 | 0.000061 | 0.000105 | -0.0009 | 0.000000 | 0.00000 | 0.000100 | 0.0009 |
| 96 | 1579.0 | 0.017350 | 0.219337 | -1.4803 | -0.088550 | 0.00410 | 0.122400 | 2.5093 |
| 97 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 98 | 1579.0 | -0.019204 | 0.426868 | -5.2717 | -0.218850 | 0.00000 | 0.189250 | 2.5698 |
| 99 | 1579.0 | 0.001622 | 0.062562 | -0.5283 | -0.029800 | 0.00000 | 0.029800 | 0.8854 |
| 100 | 1579.0 | -0.000020 | 0.000356 | -0.0030 | -0.000200 | 0.00000 | 0.000200 | 0.0023 |
| 101 | 1579.0 | -0.000008 | 0.000220 | -0.0024 | -0.000100 | 0.00000 | 0.000100 | 0.0017 |
| 102 | 1579.0 | 0.001330 | 0.062969 | -0.5353 | -0.035200 | 0.00000 | 0.033650 | 0.2979 |
| 103 | 1583.0 | -0.009737 | 0.003089 | -0.0329 | -0.011800 | -0.01010 | -0.008000 | 0.0203 |
| 104 | 1583.0 | -0.000015 | 0.000848 | -0.0119 | -0.000400 | 0.00000 | 0.000400 | 0.0071 |
| 105 | 1579.0 | -0.000482 | 0.003206 | -0.0281 | -0.001900 | -0.00020 | 0.001100 | 0.0127 |
| 106 | 1579.0 | 0.000544 | 0.002974 | -0.0133 | -0.001000 | 0.00020 | 0.001600 | 0.0172 |
| 107 | 1579.0 | -0.002186 | 0.087847 | -0.5226 | -0.048900 | 0.00000 | 0.048600 | 0.4856 |
| 108 | 1579.0 | -0.011122 | 0.087465 | -0.3454 | -0.065800 | -0.01180 | 0.037850 | 0.3938 |
| 109 | 549.0 | 0.979993 | 0.008695 | 0.7848 | 0.978800 | 0.98100 | 0.982300 | 0.9842 |
| 110 | 549.0 | 101.318253 | 1.880087 | 88.1938 | 100.389000 | 101.48170 | 102.078100 | 106.9227 |
| 111 | 549.0 | 231.818898 | 2.105318 | 213.0083 | 230.373800 | 231.20120 | 233.036100 | 236.9546 |
| 112 | 863.0 | 0.457625 | 0.048635 | 0.0000 | 0.459300 | 0.46290 | 0.466450 | 0.4885 |
| 113 | 1585.0 | 0.945351 | 0.012136 | 0.8534 | 0.938600 | 0.94630 | 0.952200 | 0.9763 |
| 114 | 1585.0 | 0.000121 | 0.001659 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0414 |
| 115 | 1585.0 | 747.325221 | 48.869527 | 544.0254 | 721.064600 | 750.67320 | 776.768000 | 924.5318 |
| 116 | 1585.0 | 0.987143 | 0.009467 | 0.8900 | 0.989500 | 0.99050 | 0.990900 | 0.9924 |
| 117 | 1585.0 | 58.625267 | 6.448657 | 52.8068 | 57.979400 | 58.54920 | 59.132700 | 311.7344 |
| 118 | 1561.0 | 0.598414 | 0.008086 | 0.5274 | 0.594100 | 0.59900 | 0.603400 | 0.6245 |
| 119 | 1585.0 | 0.970769 | 0.008932 | 0.8411 | 0.964800 | 0.96940 | 0.978300 | 0.9827 |
| 120 | 1585.0 | 6.311415 | 0.124091 | 5.1259 | 6.247000 | 6.31430 | 6.376000 | 7.5220 |
| 121 | 1576.0 | 15.796567 | 0.099420 | 15.4600 | 15.730000 | 15.79000 | 15.860000 | 16.0700 |
| 122 | 1576.0 | 3.885919 | 0.907184 | 1.6710 | 3.180000 | 3.86500 | 4.392000 | 6.8890 |
| 123 | 1576.0 | 15.829822 | 0.108050 | 15.1700 | 15.770000 | 15.83000 | 15.900000 | 16.1000 |
| 124 | 1576.0 | 15.794924 | 0.113835 | 15.4300 | 15.730000 | 15.78000 | 15.870000 | 16.1000 |
| 125 | 1576.0 | 1.181530 | 0.281107 | 0.3122 | 0.964900 | 1.13500 | 1.338000 | 2.4650 |
| 126 | 1576.0 | 2.751291 | 0.253059 | 2.3400 | 2.572750 | 2.73600 | 2.873000 | 3.9910 |
| 127 | 1576.0 | 0.646229 | 0.136462 | 0.3161 | 0.545100 | 0.65190 | 0.712400 | 1.1750 |
| 128 | 1576.0 | 3.193187 | 0.263039 | 0.0000 | 3.076000 | 3.19800 | 3.311500 | 3.8950 |
| 129 | 1576.0 | -0.548409 | 1.215485 | -3.7790 | -0.898800 | -0.14190 | 0.047300 | 2.4580 |
| 130 | 1576.0 | 0.745284 | 0.082164 | 0.4199 | 0.688700 | 0.75890 | 0.814275 | 0.8884 |
| 131 | 1576.0 | 0.997809 | 0.002248 | 0.9936 | 0.996400 | 0.99775 | 0.998900 | 1.0190 |
| 132 | 1577.0 | 2.318613 | 0.052988 | 2.1911 | 2.277300 | 2.31240 | 2.358300 | 2.4723 |
| 133 | 1577.0 | 1003.958739 | 6.574892 | 980.4510 | 999.939100 | 1003.97120 | 1008.670600 | 1020.9944 |
| 134 | 1577.0 | 39.393026 | 2.980539 | 33.3658 | 37.368900 | 38.90260 | 40.804600 | 64.1287 |
| 135 | 1580.0 | 117.788608 | 57.269036 | 58.0000 | 91.000000 | 109.00000 | 127.000000 | 994.0000 |
| 136 | 1579.0 | 138.254212 | 53.765905 | 36.1000 | 90.250000 | 135.00000 | 180.900000 | 295.8000 |
| 137 | 1578.0 | 122.753992 | 52.148369 | 19.2000 | 81.325000 | 117.95000 | 161.600000 | 334.7000 |
| 138 | 1571.0 | 57.548057 | 12.298160 | 19.8000 | 50.750000 | 55.89990 | 62.900100 | 141.7998 |
| 139 | 1571.0 | 415.026093 | 262.375208 | 0.0000 | 241.589000 | 334.14290 | 494.806000 | 1770.6909 |
| 140 | 1571.0 | 25.782006 | 504.015305 | 0.0319 | 0.131650 | 0.23520 | 0.434800 | 9998.8944 |
| 141 | 1571.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 142 | 1571.0 | 6.637664 | 3.535575 | 1.7400 | 5.110000 | 6.26000 | 7.510000 | 103.3900 |
| 143 | 1576.0 | 0.004165 | 0.001277 | 0.0000 | 0.003300 | 0.00390 | 0.004900 | 0.0121 |
| 144 | 1583.0 | 0.119767 | 0.061083 | 0.0324 | 0.083900 | 0.10730 | 0.132600 | 0.6253 |
| 145 | 1583.0 | 0.063674 | 0.026474 | 0.0214 | 0.048100 | 0.05870 | 0.071900 | 0.2507 |
| 146 | 1583.0 | 0.054951 | 0.021746 | 0.0227 | 0.042350 | 0.05000 | 0.061300 | 0.2479 |
| 147 | 1583.0 | 0.017501 | 0.026990 | 0.0043 | 0.010000 | 0.01600 | 0.021400 | 0.9783 |
| 148 | 1583.0 | 8.501850 | 18.638995 | 1.4208 | 6.362800 | 7.93660 | 9.642150 | 742.9421 |
| 149 | 1582.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 150 | 1582.0 | 6.825557 | 3.247664 | 1.3370 | 4.471000 | 5.95950 | 8.279000 | 22.3180 |
| 151 | 1582.0 | 14.029675 | 30.827761 | 2.0200 | 8.119250 | 11.00000 | 14.370500 | 536.5640 |
| 152 | 1582.0 | 1.190899 | 23.230774 | 0.1544 | 0.374675 | 0.46885 | 0.681975 | 924.3780 |
| 153 | 1582.0 | 0.011976 | 0.009319 | 0.0036 | 0.007300 | 0.01110 | 0.014975 | 0.2389 |
| 154 | 1582.0 | 7.730844 | 5.228469 | 1.2438 | 5.929900 | 7.53425 | 9.106900 | 191.5478 |
| 155 | 1575.0 | 0.514156 | 1.157616 | 0.1400 | 0.245000 | 0.32000 | 0.450000 | 12.7100 |
| 156 | 1585.0 | 0.058010 | 0.078743 | 0.0111 | 0.036300 | 0.04870 | 0.066700 | 2.2016 |
| 157 | 138.0 | 0.047104 | 0.039538 | 0.0118 | 0.027050 | 0.03545 | 0.048875 | 0.2876 |
| 158 | 138.0 | 1039.650738 | 406.848810 | 234.0996 | 721.675050 | 1020.30005 | 1277.750125 | 2505.2998 |
| 159 | 1583.0 | 897.887555 | 1010.161657 | 0.0000 | 414.000000 | 626.00000 | 970.000000 | 7791.0000 |
| 160 | 1583.0 | 561.144662 | 585.082305 | 0.0000 | 295.000000 | 439.00000 | 626.500000 | 4170.0000 |
| 161 | 1583.0 | 4040.008212 | 4225.452719 | 0.0000 | 1302.500000 | 2595.00000 | 5007.500000 | 37943.0000 |
| 162 | 1583.0 | 4757.310802 | 6527.835207 | 0.0000 | 448.500000 | 1770.00000 | 6267.500000 | 36871.0000 |
| 163 | 1583.0 | 0.141819 | 0.127686 | 0.0000 | 0.091000 | 0.12000 | 0.154000 | 0.9570 |
| 164 | 1583.0 | 0.131567 | 0.255046 | 0.0000 | 0.068000 | 0.08900 | 0.116500 | 1.8170 |
| 165 | 1583.0 | 0.257764 | 0.429096 | 0.0000 | 0.130500 | 0.18300 | 0.255000 | 3.2860 |
| 166 | 1583.0 | 2.784839 | 1.115894 | 0.8000 | 2.100000 | 2.60000 | 3.200000 | 21.1000 |
| 167 | 1583.0 | 1.232407 | 0.630237 | 0.3000 | 0.900000 | 1.20000 | 1.500000 | 16.3000 |
| 168 | 1583.0 | 0.124097 | 0.047500 | 0.0330 | 0.090000 | 0.11900 | 0.150000 | 0.7250 |
| 169 | 1583.0 | 0.398541 | 0.198523 | 0.0460 | 0.223500 | 0.40800 | 0.535000 | 1.1430 |
| 170 | 1584.0 | 0.685071 | 0.158043 | 0.2979 | 0.575600 | 0.68600 | 0.797425 | 1.1530 |
| 171 | 1584.0 | 0.120101 | 0.061219 | 0.0089 | 0.079800 | 0.11250 | 0.140400 | 0.4940 |
| 172 | 1584.0 | 0.320554 | 0.071284 | 0.1287 | 0.276825 | 0.32410 | 0.371400 | 0.5484 |
| 173 | 1584.0 | 0.576334 | 0.095542 | 0.2538 | 0.517100 | 0.57850 | 0.634700 | 0.8643 |
| 174 | 1584.0 | 0.320554 | 0.071288 | 0.1287 | 0.276800 | 0.32410 | 0.371500 | 0.5484 |
| 175 | 1584.0 | 0.779471 | 0.116887 | 0.4616 | 0.693625 | 0.77090 | 0.845300 | 1.1720 |
| 176 | 1584.0 | 0.244753 | 0.074805 | 0.0735 | 0.196200 | 0.24360 | 0.293575 | 0.4411 |
| 177 | 1584.0 | 0.395465 | 0.283014 | 0.0470 | 0.222000 | 0.29900 | 0.426000 | 1.8580 |
| 178 | 1561.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 179 | 1584.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 180 | 1584.0 | 18.992052 | 3.309355 | 9.4000 | 16.840000 | 18.67000 | 20.942500 | 48.6700 |
| 181 | 1584.0 | 0.545605 | 0.224004 | 0.0930 | 0.377750 | 0.52100 | 0.688000 | 3.5730 |
| 182 | 1584.0 | 10.792203 | 4.149844 | 3.1700 | 7.740000 | 10.25500 | 13.350000 | 55.0000 |
| 183 | 1584.0 | 26.653001 | 6.844530 | 5.0140 | 21.150750 | 27.13500 | 31.689500 | 72.9470 |
| 184 | 1584.0 | 0.144518 | 0.109686 | 0.0297 | 0.101900 | 0.13245 | 0.169000 | 3.2283 |
| 185 | 1584.0 | 7.355650 | 7.151683 | 1.9400 | 5.370000 | 6.73500 | 8.442500 | 267.9100 |
| 186 | 1584.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 187 | 1584.0 | 17.931717 | 8.576702 | 6.2200 | 14.515000 | 17.87000 | 20.865000 | 307.9300 |
| 188 | 1584.0 | 43.340557 | 21.726000 | 6.6130 | 24.783500 | 40.42550 | 57.710500 | 191.8300 |
| 189 | 1584.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 190 | 1581.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 191 | 1581.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 192 | 1581.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 193 | 1581.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 194 | 1581.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 195 | 1581.0 | 0.289267 | 0.407817 | 0.0800 | 0.221000 | 0.25900 | 0.296000 | 4.8380 |
| 196 | 1578.0 | 8.641984 | 15.637620 | 1.7500 | 5.020000 | 6.75500 | 9.497500 | 396.1100 |
| 197 | 1579.0 | 20.078923 | 10.498951 | 9.2200 | 17.125000 | 19.38000 | 21.455000 | 252.8700 |
| 198 | 1579.0 | 0.559466 | 0.536216 | 0.0900 | 0.296000 | 0.42700 | 0.728000 | 10.0170 |
| 199 | 1579.0 | 11.524015 | 16.353755 | 2.7700 | 6.760000 | 8.58000 | 11.480000 | 390.1200 |
| 200 | 1578.0 | 17.561179 | 8.659241 | 3.2100 | 14.095000 | 17.21000 | 20.140000 | 199.6200 |
| 201 | 1578.0 | 7.802541 | 5.088916 | 0.0000 | 4.990000 | 6.69500 | 9.480000 | 126.5300 |
| 202 | 1578.0 | 10.130608 | 14.546586 | 0.0000 | 6.074000 | 8.44950 | 11.906500 | 490.5610 |
| 203 | 1579.0 | 30.027821 | 17.386568 | 7.7280 | 24.603500 | 30.08500 | 33.508500 | 500.3490 |
| 204 | 1579.0 | 31.854532 | 561.879195 | 0.0429 | 0.114350 | 0.15870 | 0.232000 | 9998.4483 |
| 205 | 1579.0 | 9.070323 | 11.487021 | 2.3000 | 6.060000 | 7.76000 | 9.965000 | 320.0500 |
| 206 | 1579.0 | 0.001267 | 0.050331 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 2.0000 |
| 207 | 1579.0 | 20.356295 | 17.414689 | 4.0100 | 16.335000 | 19.72000 | 22.370000 | 457.6500 |
| 208 | 1579.0 | 73.482973 | 28.101636 | 5.3590 | 56.307000 | 73.40500 | 90.680000 | 172.3490 |
| 209 | 1579.0 | 0.029227 | 1.161397 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 46.1500 |
| 210 | 1561.0 | 0.088819 | 0.042036 | 0.0319 | 0.065400 | 0.07960 | 0.099500 | 0.5164 |
| 211 | 1561.0 | 0.056748 | 0.024946 | 0.0022 | 0.043800 | 0.05320 | 0.064200 | 0.3227 |
| 212 | 1561.0 | 0.051757 | 0.031746 | 0.0071 | 0.032500 | 0.04180 | 0.063600 | 0.5941 |
| 213 | 1561.0 | 0.061277 | 0.061229 | 0.0037 | 0.036700 | 0.05620 | 0.073700 | 1.2837 |
| 214 | 1561.0 | 0.083039 | 0.056213 | 0.0193 | 0.056600 | 0.07510 | 0.093300 | 0.7615 |
| 215 | 1561.0 | 0.081029 | 0.030440 | 0.0059 | 0.063100 | 0.08250 | 0.098400 | 0.3429 |
| 216 | 1561.0 | 0.083317 | 0.025794 | 0.0097 | 0.069300 | 0.08460 | 0.097300 | 0.2828 |
| 217 | 1561.0 | 0.071997 | 0.047659 | 0.0079 | 0.046100 | 0.06170 | 0.086500 | 0.6744 |
| 218 | 1584.0 | 3.770322 | 1.172182 | 1.0340 | 2.944225 | 3.62375 | 4.398900 | 8.8015 |
| 219 | 1573.0 | 0.003255 | 0.001657 | 0.0007 | 0.002300 | 0.00300 | 0.003800 | 0.0163 |
| 220 | 226.0 | 0.009213 | 0.001989 | 0.0057 | 0.007800 | 0.00895 | 0.010300 | 0.0240 |
| 221 | 1585.0 | 0.060919 | 0.023309 | 0.0200 | 0.040400 | 0.06170 | 0.076500 | 0.2305 |
| 222 | 1585.0 | 0.008823 | 0.055634 | 0.0003 | 0.001400 | 0.00230 | 0.005600 | 0.9911 |
| 223 | 1585.0 | 122.863913 | 54.986048 | 32.2637 | 95.169200 | 119.67380 | 144.502800 | 1768.8802 |
| 224 | 1534.0 | 0.060970 | 0.082607 | 0.0093 | 0.029800 | 0.03990 | 0.061975 | 1.4361 |
| 225 | 1534.0 | 1042.428087 | 432.773858 | 168.7998 | 720.024900 | 968.00050 | 1265.574950 | 3601.2998 |
| 226 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 227 | 1583.0 | 0.019233 | 0.010853 | 0.0062 | 0.013300 | 0.01660 | 0.021300 | 0.1541 |
| 228 | 1583.0 | 0.017921 | 0.010774 | 0.0072 | 0.012700 | 0.01550 | 0.020250 | 0.2133 |
| 229 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 230 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 231 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 232 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 233 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 234 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 235 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 236 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 237 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 238 | 1583.0 | 0.004799 | 0.001698 | 0.0013 | 0.003700 | 0.00460 | 0.005700 | 0.0244 |
| 239 | 1583.0 | 0.004573 | 0.001438 | 0.0014 | 0.003600 | 0.00440 | 0.005300 | 0.0236 |
| 240 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 241 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 242 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 243 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 244 | 549.0 | 0.005755 | 0.084618 | 0.0003 | 0.001200 | 0.00170 | 0.002600 | 1.9844 |
| 245 | 549.0 | 1.729723 | 4.335614 | 0.2914 | 0.911500 | 1.18510 | 1.761800 | 99.9022 |
| 246 | 549.0 | 4.148742 | 10.045084 | 1.1022 | 2.725900 | 3.67300 | 4.479700 | 237.1837 |
| 247 | 863.0 | 0.053539 | 0.066747 | 0.0000 | 0.019300 | 0.02740 | 0.051550 | 0.4914 |
| 248 | 1585.0 | 0.025221 | 0.049014 | 0.0030 | 0.014700 | 0.02100 | 0.027300 | 0.9732 |
| 249 | 1585.0 | 0.001053 | 0.015681 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.4138 |
| 250 | 1585.0 | 109.817499 | 54.514579 | 21.0107 | 76.342300 | 103.18380 | 132.149400 | 1119.7042 |
| 251 | 1585.0 | 0.004259 | 0.037263 | 0.0003 | 0.000700 | 0.00100 | 0.001300 | 0.9909 |
| 252 | 1585.0 | 4.629044 | 63.988267 | 0.7673 | 2.215000 | 2.86650 | 3.794800 | 2549.9885 |
| 253 | 1561.0 | 0.033215 | 0.022336 | 0.0094 | 0.024500 | 0.03080 | 0.037900 | 0.4517 |
| 254 | 1585.0 | 0.013947 | 0.009127 | 0.0017 | 0.004800 | 0.01500 | 0.021300 | 0.0787 |
| 255 | 1585.0 | 0.404053 | 0.119882 | 0.1269 | 0.308800 | 0.40530 | 0.480400 | 0.9255 |
| 256 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 257 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 258 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 259 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 260 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 261 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 262 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 263 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 264 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 265 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 266 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 267 | 1577.0 | 0.070216 | 0.029692 | 0.0198 | 0.044000 | 0.06990 | 0.091600 | 0.1578 |
| 268 | 1577.0 | 19.627058 | 7.412139 | 6.0980 | 13.880000 | 18.14000 | 24.671000 | 40.8550 |
| 269 | 1577.0 | 3.772727 | 1.149138 | 1.3017 | 2.956500 | 3.69120 | 4.374200 | 10.1529 |
| 270 | 1580.0 | 29.249472 | 8.366506 | 15.5471 | 24.969100 | 28.77350 | 31.702200 | 158.5260 |
| 271 | 1579.0 | 46.106248 | 17.845203 | 10.4015 | 30.084300 | 45.71840 | 59.593000 | 132.6479 |
| 272 | 1578.0 | 41.329228 | 17.710290 | 6.9431 | 27.094350 | 40.25975 | 54.259050 | 122.1174 |
| 273 | 1571.0 | 20.159401 | 3.819475 | 8.6512 | 18.240700 | 19.56300 | 22.089100 | 43.5737 |
| 274 | 1571.0 | 135.734273 | 85.298521 | 0.0000 | 81.231500 | 110.23120 | 161.828050 | 659.1696 |
| 275 | 1571.0 | 8.594530 | 167.980623 | 0.0111 | 0.044700 | 0.07810 | 0.144900 | 3332.5964 |
| 276 | 1571.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 277 | 1571.0 | 2.210564 | 1.191129 | 0.5615 | 1.697700 | 2.08360 | 2.527450 | 32.1709 |
| 278 | 1576.0 | 0.001116 | 0.000339 | 0.0000 | 0.000900 | 0.00110 | 0.001300 | 0.0034 |
| 279 | 1583.0 | 0.040974 | 0.020208 | 0.0107 | 0.028300 | 0.03710 | 0.045600 | 0.1884 |
| 280 | 1583.0 | 0.018033 | 0.006464 | 0.0073 | 0.014200 | 0.01690 | 0.020700 | 0.0755 |
| 281 | 1583.0 | 0.015084 | 0.005519 | 0.0069 | 0.011900 | 0.01390 | 0.016600 | 0.0597 |
| 282 | 1583.0 | 0.005800 | 0.008509 | 0.0016 | 0.003400 | 0.00530 | 0.007100 | 0.3083 |
| 283 | 1583.0 | 2.812655 | 5.832365 | 0.5050 | 2.211500 | 2.66300 | 3.159500 | 232.8049 |
| 284 | 1582.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 285 | 1582.0 | 2.122992 | 0.963970 | 0.4611 | 1.442100 | 1.87960 | 2.607050 | 6.8698 |
| 286 | 1582.0 | 4.252710 | 9.708540 | 0.7280 | 2.470800 | 3.37215 | 4.299200 | 207.0161 |
| 287 | 1582.0 | 0.365697 | 7.344204 | 0.0513 | 0.115125 | 0.13905 | 0.198825 | 292.2274 |
| 288 | 1582.0 | 0.003940 | 0.002928 | 0.0012 | 0.002400 | 0.00370 | 0.004900 | 0.0749 |
| 289 | 1582.0 | 2.588217 | 1.613592 | 0.3960 | 2.092875 | 2.55665 | 3.037350 | 59.5187 |
| 290 | 1575.0 | 0.126104 | 0.290340 | 0.0416 | 0.065350 | 0.08410 | 0.118500 | 4.4203 |
| 291 | 1585.0 | 0.019890 | 0.025411 | 0.0038 | 0.012500 | 0.01690 | 0.023600 | 0.6915 |
| 292 | 138.0 | 0.014487 | 0.011494 | 0.0041 | 0.008725 | 0.01100 | 0.014925 | 0.0831 |
| 293 | 138.0 | 335.551157 | 137.692483 | 82.3233 | 229.809450 | 317.86710 | 403.989300 | 879.2260 |
| 294 | 1583.0 | 409.669525 | 493.030177 | 0.0000 | 185.921000 | 280.07020 | 431.473750 | 3933.7550 |
| 295 | 1583.0 | 255.902532 | 289.209968 | 0.0000 | 130.390300 | 196.04660 | 274.486700 | 2005.8744 |
| 296 | 1583.0 | 1867.399073 | 1968.741379 | 0.0000 | 597.084600 | 1192.74600 | 2318.891600 | 15559.9525 |
| 297 | 1583.0 | 2322.995462 | 3214.309116 | 0.0000 | 209.795500 | 812.73490 | 3133.348850 | 18520.4683 |
| 298 | 1583.0 | 0.064704 | 0.067448 | 0.0000 | 0.040550 | 0.05270 | 0.069200 | 0.5264 |
| 299 | 1583.0 | 0.062255 | 0.137661 | 0.0000 | 0.030050 | 0.04000 | 0.052200 | 1.0312 |
| 300 | 1583.0 | 0.121573 | 0.231013 | 0.0000 | 0.058700 | 0.08250 | 0.115600 | 1.8123 |
| 301 | 1583.0 | 0.909174 | 0.330853 | 0.3100 | 0.717200 | 0.85940 | 1.044900 | 5.7110 |
| 302 | 1583.0 | 0.402273 | 0.196719 | 0.1118 | 0.295800 | 0.37900 | 0.475050 | 5.1549 |
| 303 | 1583.0 | 0.040260 | 0.014469 | 0.0108 | 0.029900 | 0.03870 | 0.048550 | 0.2258 |
| 304 | 1583.0 | 0.131440 | 0.065062 | 0.0138 | 0.070250 | 0.13680 | 0.178100 | 0.3337 |
| 305 | 1584.0 | 0.265100 | 0.057499 | 0.1171 | 0.225000 | 0.26430 | 0.307700 | 0.4750 |
| 306 | 1584.0 | 0.048639 | 0.025567 | 0.0034 | 0.033100 | 0.04480 | 0.055300 | 0.2246 |
| 307 | 1584.0 | 0.129084 | 0.027466 | 0.0549 | 0.113700 | 0.12950 | 0.149725 | 0.2112 |
| 308 | 1584.0 | 0.218552 | 0.033543 | 0.0913 | 0.197600 | 0.21960 | 0.237925 | 0.3239 |
| 309 | 1584.0 | 0.129084 | 0.027469 | 0.0549 | 0.113700 | 0.12950 | 0.149725 | 0.2112 |
| 310 | 1584.0 | 0.305200 | 0.043586 | 0.1809 | 0.278700 | 0.30290 | 0.332025 | 0.4438 |
| 311 | 1584.0 | 0.097351 | 0.028772 | 0.0328 | 0.077525 | 0.09770 | 0.115900 | 0.1784 |
| 312 | 1584.0 | 0.160326 | 0.117286 | 0.0224 | 0.091600 | 0.12160 | 0.160200 | 0.7549 |
| 313 | 1561.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 314 | 1561.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 315 | 1584.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 316 | 1584.0 | 5.970061 | 1.016953 | 2.7882 | 5.288575 | 5.82390 | 6.536325 | 13.0958 |
| 317 | 1584.0 | 0.172315 | 0.072291 | 0.0283 | 0.117200 | 0.16215 | 0.217950 | 1.0034 |
| 318 | 1584.0 | 3.190791 | 1.211045 | 0.9848 | 2.322250 | 2.91300 | 4.018350 | 15.8934 |
| 319 | 1584.0 | 7.913238 | 2.177912 | 1.6574 | 6.250450 | 8.38425 | 9.481850 | 20.0455 |
| 320 | 1584.0 | 0.043006 | 0.031733 | 0.0084 | 0.031175 | 0.03965 | 0.050125 | 0.9474 |
| 321 | 1584.0 | 2.259924 | 2.106291 | 0.6114 | 1.669350 | 2.07440 | 2.630800 | 79.1515 |
| 322 | 1584.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 323 | 1584.0 | 5.390534 | 2.509134 | 1.7101 | 4.273850 | 5.44545 | 6.343825 | 89.1917 |
| 324 | 1584.0 | 13.363705 | 6.616202 | 2.2345 | 7.602800 | 12.56035 | 17.942475 | 51.8678 |
| 325 | 1584.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 326 | 1581.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 327 | 1581.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 328 | 1581.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 329 | 1581.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 330 | 1581.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 331 | 1581.0 | 0.083459 | 0.064730 | 0.0224 | 0.068800 | 0.08480 | 0.095600 | 1.0959 |
| 332 | 1578.0 | 2.579610 | 5.614570 | 0.5373 | 1.540750 | 2.05285 | 2.788075 | 174.8944 |
| 333 | 1579.0 | 6.210241 | 3.385299 | 2.8372 | 5.449200 | 5.97760 | 6.547150 | 90.5159 |
| 334 | 1579.0 | 0.169084 | 0.172425 | 0.0282 | 0.089700 | 0.12980 | 0.213300 | 3.4125 |
| 335 | 1579.0 | 3.423275 | 5.749029 | 0.7899 | 2.039950 | 2.51760 | 3.360400 | 172.7119 |
| 336 | 1578.0 | 9.728759 | 7.514222 | 5.2151 | 8.288175 | 9.06965 | 10.040175 | 214.8628 |
| 337 | 1578.0 | 2.316641 | 1.693277 | 0.0000 | 1.530525 | 2.04395 | 2.772925 | 38.8995 |
| 338 | 1578.0 | 3.025082 | 5.614349 | 0.0000 | 1.898225 | 2.55315 | 3.389025 | 196.6880 |
| 339 | 1579.0 | 9.310677 | 6.046416 | 2.2001 | 7.518050 | 9.46210 | 10.438250 | 197.4988 |
| 340 | 1579.0 | 14.507326 | 260.245969 | 0.0131 | 0.034900 | 0.04670 | 0.067200 | 5043.8789 |
| 341 | 1579.0 | 2.737962 | 3.649245 | 0.5741 | 1.919450 | 2.38570 | 2.996550 | 97.7089 |
| 342 | 1579.0 | 0.000283 | 0.011254 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.4472 |
| 343 | 1579.0 | 6.191544 | 5.345799 | 1.2565 | 4.983100 | 6.00250 | 6.886700 | 156.3360 |
| 344 | 1579.0 | 23.281035 | 8.901514 | 2.0560 | 17.949650 | 23.33370 | 28.904100 | 59.3241 |
| 345 | 781.0 | 7.956686 | 17.426719 | 1.7694 | 4.451000 | 5.56730 | 6.835100 | 257.0106 |
| 346 | 781.0 | 5.779293 | 16.994432 | 1.0177 | 2.536500 | 3.05910 | 4.141300 | 187.7589 |
| 347 | 1579.0 | 0.008812 | 0.350173 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 13.9147 |
| 348 | 1561.0 | 0.024707 | 0.011867 | 0.0103 | 0.018000 | 0.02260 | 0.027400 | 0.2200 |
| 349 | 1561.0 | 0.025243 | 0.010578 | 0.0010 | 0.019600 | 0.02400 | 0.028600 | 0.1339 |
| 350 | 1561.0 | 0.023349 | 0.014406 | 0.0029 | 0.014600 | 0.01890 | 0.029000 | 0.2914 |
| 351 | 1561.0 | 0.028025 | 0.028680 | 0.0020 | 0.016600 | 0.02540 | 0.034000 | 0.6188 |
| 352 | 1561.0 | 0.023305 | 0.013103 | 0.0056 | 0.016000 | 0.02190 | 0.026900 | 0.1429 |
| 353 | 1561.0 | 0.040308 | 0.015498 | 0.0026 | 0.030200 | 0.04220 | 0.050200 | 0.1535 |
| 354 | 1561.0 | 0.041828 | 0.013101 | 0.0040 | 0.034700 | 0.04410 | 0.050000 | 0.1344 |
| 355 | 1561.0 | 0.034707 | 0.022903 | 0.0038 | 0.021200 | 0.02940 | 0.042500 | 0.2789 |
| 356 | 1584.0 | 1.298296 | 0.387207 | 0.3796 | 1.025225 | 1.25490 | 1.533175 | 2.8348 |
| 357 | 1573.0 | 0.001000 | 0.000504 | 0.0003 | 0.000700 | 0.00090 | 0.001100 | 0.0052 |
| 358 | 226.0 | 0.002443 | 0.000395 | 0.0017 | 0.002200 | 0.00240 | 0.002700 | 0.0047 |
| 359 | 1585.0 | 0.019903 | 0.007146 | 0.0076 | 0.013900 | 0.01970 | 0.025000 | 0.0888 |
| 360 | 1585.0 | 0.002944 | 0.019893 | 0.0001 | 0.000400 | 0.00070 | 0.001800 | 0.4090 |
| 361 | 1585.0 | 39.941024 | 16.998644 | 10.7204 | 32.177700 | 39.74260 | 46.977500 | 547.1722 |
| 362 | 1534.0 | 0.018864 | 0.024954 | 0.0028 | 0.009500 | 0.01250 | 0.018800 | 0.4163 |
| 363 | 1534.0 | 333.925746 | 138.786411 | 60.9882 | 229.157400 | 310.77660 | 412.709800 | 1072.2031 |
| 364 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 365 | 1583.0 | 0.005226 | 0.002676 | 0.0017 | 0.003800 | 0.00460 | 0.005800 | 0.0368 |
| 366 | 1583.0 | 0.004836 | 0.002401 | 0.0020 | 0.003600 | 0.00430 | 0.005400 | 0.0392 |
| 367 | 1579.0 | 0.003786 | 0.002703 | 0.0000 | 0.002600 | 0.00320 | 0.004200 | 0.0357 |
| 368 | 1579.0 | 0.003186 | 0.002120 | 0.0000 | 0.002200 | 0.00280 | 0.003600 | 0.0334 |
| 369 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 370 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 371 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 372 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 373 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 374 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 375 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 376 | 1583.0 | 0.001604 | 0.000535 | 0.0004 | 0.001300 | 0.00160 | 0.001900 | 0.0082 |
| 377 | 1583.0 | 0.001571 | 0.000466 | 0.0004 | 0.001300 | 0.00150 | 0.001800 | 0.0077 |
| 378 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 379 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 380 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 381 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 382 | 549.0 | 0.001826 | 0.026740 | 0.0001 | 0.000400 | 0.00050 | 0.000800 | 0.6271 |
| 383 | 549.0 | 0.541040 | 1.341020 | 0.0875 | 0.295500 | 0.37260 | 0.541200 | 30.9982 |
| 384 | 549.0 | 1.285448 | 3.168427 | 0.3383 | 0.842300 | 1.10630 | 1.386600 | 74.8445 |
| 385 | 863.0 | 0.011429 | 0.014306 | 0.0000 | 0.005300 | 0.00680 | 0.011350 | 0.2073 |
| 386 | 1585.0 | 0.008299 | 0.015417 | 0.0008 | 0.004800 | 0.00680 | 0.009300 | 0.3068 |
| 387 | 1585.0 | 0.000335 | 0.004961 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.1309 |
| 388 | 1585.0 | 35.228095 | 17.220514 | 6.3101 | 24.395000 | 32.58130 | 42.749700 | 348.8293 |
| 389 | 1585.0 | 0.001330 | 0.011750 | 0.0001 | 0.000200 | 0.00030 | 0.000400 | 0.3127 |
| 390 | 1585.0 | 1.427360 | 20.210644 | 0.3046 | 0.676800 | 0.88460 | 1.147300 | 805.3936 |
| 391 | 1561.0 | 0.010954 | 0.006707 | 0.0031 | 0.008300 | 0.01020 | 0.012400 | 0.1375 |
| 392 | 1585.0 | 0.004535 | 0.002955 | 0.0005 | 0.001500 | 0.00490 | 0.006900 | 0.0229 |
| 393 | 1585.0 | 0.134091 | 0.038263 | 0.0342 | 0.105400 | 0.13400 | 0.160400 | 0.2994 |
| 394 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 395 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 396 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 397 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 398 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 399 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 400 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 401 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 402 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 403 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 404 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 405 | 1577.0 | 0.024074 | 0.010743 | 0.0062 | 0.013700 | 0.02370 | 0.032300 | 0.0514 |
| 406 | 1577.0 | 6.781749 | 2.862299 | 2.0545 | 4.574800 | 5.96630 | 8.727600 | 14.7277 |
| 407 | 1577.0 | 1.229856 | 0.364101 | 0.4240 | 0.965600 | 1.23530 | 1.416700 | 3.3128 |
| 408 | 1580.0 | 5.333058 | 2.565775 | 2.7378 | 4.127800 | 4.92120 | 5.780400 | 44.3100 |
| 409 | 1579.0 | 4.582688 | 1.772200 | 1.2163 | 3.021200 | 4.49770 | 5.929950 | 9.5765 |
| 410 | 1578.0 | 4.930879 | 2.118015 | 0.7342 | 3.269425 | 4.73960 | 6.453950 | 13.8071 |
| 411 | 1571.0 | 2.613332 | 0.549482 | 0.9609 | 2.314900 | 2.54180 | 2.848900 | 6.2150 |
| 412 | 1571.0 | 30.819750 | 18.343011 | 0.0000 | 18.353400 | 25.98370 | 38.006050 | 128.2816 |
| 413 | 1571.0 | 25.550500 | 47.056993 | 4.0416 | 11.325400 | 20.25510 | 29.307300 | 899.1190 |
| 414 | 1571.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 415 | 1571.0 | 6.624708 | 3.938918 | 1.5340 | 4.927400 | 6.17660 | 7.565100 | 116.8615 |
| 416 | 1576.0 | 3.400833 | 1.031382 | 0.0000 | 2.660100 | 3.23190 | 4.000700 | 9.6900 |
| 417 | 1583.0 | 8.172229 | 4.038289 | 2.1531 | 5.763450 | 7.37660 | 9.135250 | 39.0376 |
| 418 | 1583.0 | 320.469276 | 286.754983 | 0.0000 | 0.000000 | 302.51320 | 523.120400 | 999.3160 |
| 419 | 1583.0 | 308.739010 | 325.606713 | 0.0000 | 0.000000 | 270.80320 | 582.803100 | 998.6813 |
| 420 | 1583.0 | 1.830745 | 3.042484 | 0.4411 | 1.035400 | 1.65810 | 2.229950 | 111.4956 |
| 421 | 1583.0 | 4.189798 | 6.877874 | 0.7217 | 3.184850 | 3.95960 | 4.822850 | 273.0952 |
| 422 | 1582.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 423 | 1582.0 | 77.611340 | 32.519230 | 23.0200 | 55.976225 | 69.75830 | 92.765850 | 424.2152 |
| 424 | 1582.0 | 3.311764 | 6.289835 | 0.4866 | 1.966700 | 2.66780 | 3.475100 | 103.1809 |
| 425 | 1582.0 | 6.799220 | 23.128244 | 1.4666 | 3.773300 | 4.77255 | 6.895700 | 898.6085 |
| 426 | 1582.0 | 1.239004 | 0.992716 | 0.3632 | 0.747825 | 1.14000 | 1.542950 | 24.9904 |
| 427 | 1582.0 | 4.074912 | 3.033209 | 0.6637 | 3.113925 | 3.94855 | 4.781225 | 113.2230 |
| 428 | 1575.0 | 4.285630 | 10.958986 | 1.1198 | 1.955500 | 2.54610 | 3.633000 | 118.7533 |
| 429 | 1585.0 | 4.165833 | 6.399970 | 0.7837 | 2.575200 | 3.45430 | 4.754200 | 186.6164 |
| 430 | 1583.0 | 19.003998 | 37.875151 | 0.0000 | 7.024300 | 11.17940 | 17.648200 | 400.0000 |
| 431 | 1583.0 | 22.894285 | 38.225854 | 0.0000 | 11.069050 | 16.38970 | 21.867050 | 400.0000 |
| 432 | 1583.0 | 99.063646 | 125.965155 | 0.0000 | 30.690800 | 57.62670 | 120.136900 | 994.2857 |
| 433 | 1583.0 | 205.167427 | 225.125380 | 0.0000 | 10.113700 | 151.11560 | 304.541800 | 995.7447 |
| 434 | 1583.0 | 15.240866 | 36.045926 | 0.0000 | 7.551150 | 10.17600 | 12.755250 | 400.0000 |
| 435 | 1583.0 | 9.886985 | 36.345151 | 0.0000 | 3.480950 | 4.55000 | 5.824400 | 400.0000 |
| 436 | 1583.0 | 8.039302 | 36.553530 | 0.0000 | 1.939700 | 2.76260 | 3.826400 | 400.0000 |
| 437 | 1583.0 | 4.014245 | 1.605689 | 1.1568 | 3.071700 | 3.77200 | 4.679100 | 32.2740 |
| 438 | 1583.0 | 54.537440 | 33.963183 | 0.0000 | 36.252650 | 48.64860 | 66.359400 | 851.6129 |
| 439 | 1583.0 | 70.417478 | 38.240191 | 14.1206 | 47.976500 | 65.14190 | 84.717550 | 657.7621 |
| 440 | 1583.0 | 11.471934 | 6.181215 | 1.0973 | 5.280550 | 12.03510 | 15.733500 | 33.0580 |
| 441 | 1584.0 | 0.802996 | 0.184979 | 0.3512 | 0.679600 | 0.80760 | 0.932000 | 1.2771 |
| 442 | 1584.0 | 1.345077 | 0.662972 | 0.0974 | 0.886500 | 1.26290 | 1.579025 | 5.1317 |
| 443 | 1584.0 | 0.634833 | 0.143609 | 0.2169 | 0.550500 | 0.64350 | 0.740000 | 1.0851 |
| 444 | 1584.0 | 0.895299 | 0.155161 | 0.3336 | 0.806800 | 0.90330 | 0.989400 | 1.3511 |
| 445 | 1584.0 | 0.647958 | 0.141360 | 0.3086 | 0.558800 | 0.65110 | 0.748400 | 1.1087 |
| 446 | 1584.0 | 1.177249 | 0.177140 | 0.6968 | 1.048225 | 1.16390 | 1.275500 | 1.7639 |
| 447 | 1584.0 | 0.281940 | 0.086334 | 0.0846 | 0.226050 | 0.27990 | 0.338475 | 0.5085 |
| 448 | 1584.0 | 0.332852 | 0.236313 | 0.0399 | 0.187700 | 0.25120 | 0.356900 | 1.4754 |
| 449 | 1561.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 450 | 1561.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 451 | 1584.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 452 | 1584.0 | 5.340494 | 0.919043 | 2.6709 | 4.758725 | 5.26125 | 5.908650 | 13.9776 |
| 453 | 1584.0 | 5.449790 | 2.246946 | 0.9037 | 3.737550 | 5.21720 | 6.888525 | 34.4902 |
| 454 | 1584.0 | 7.891142 | 3.050340 | 2.3294 | 5.812375 | 7.43390 | 9.581575 | 42.0703 |
| 455 | 1584.0 | 3.634978 | 0.938766 | 0.6948 | 2.897425 | 3.72045 | 4.342275 | 10.1840 |
| 456 | 1584.0 | 12.303435 | 8.088231 | 3.0489 | 8.811950 | 11.33570 | 14.376525 | 232.1258 |
| 457 | 1584.0 | 5.256749 | 4.515152 | 1.4428 | 3.825825 | 4.79015 | 6.088925 | 164.1093 |
| 458 | 1584.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 459 | 1584.0 | 2.837330 | 1.340282 | 0.9910 | 2.291525 | 2.83220 | 3.309500 | 47.7772 |
| 460 | 1584.0 | 29.189075 | 13.287552 | 7.9534 | 20.260025 | 26.19270 | 35.223450 | 149.3851 |
| 461 | 1584.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 462 | 1581.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 463 | 1581.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 464 | 1581.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 465 | 1581.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 466 | 1581.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 467 | 1581.0 | 6.300817 | 8.962507 | 1.7163 | 4.697500 | 5.64340 | 6.373100 | 109.0074 |
| 468 | 1578.0 | 222.642360 | 230.298515 | 0.0000 | 37.357000 | 148.21055 | 333.597350 | 999.8770 |
| 469 | 1579.0 | 5.657781 | 3.135537 | 2.6009 | 4.843500 | 5.47240 | 6.005400 | 77.8007 |
| 470 | 1579.0 | 5.390852 | 4.973213 | 0.8325 | 2.824000 | 4.07460 | 7.033300 | 87.1347 |
| 471 | 1579.0 | 9.630655 | 10.118093 | 2.4026 | 5.836000 | 7.40910 | 9.747700 | 212.6557 |
| 472 | 1578.0 | 137.929773 | 47.841832 | 11.4997 | 105.199475 | 138.25515 | 168.465625 | 492.7718 |
| 473 | 1578.0 | 39.299439 | 22.408047 | 0.0000 | 24.782425 | 34.14105 | 47.687800 | 358.9504 |
| 474 | 1578.0 | 37.529709 | 24.740699 | 0.0000 | 23.111050 | 32.68760 | 45.165875 | 415.4355 |
| 475 | 1579.0 | 4.255407 | 2.599360 | 1.1011 | 3.470050 | 4.27530 | 4.741500 | 79.1162 |
| 476 | 1579.0 | 20.280023 | 15.043057 | 0.0000 | 11.595600 | 16.03320 | 23.845000 | 274.8871 |
| 477 | 1579.0 | 6.270334 | 10.132991 | 1.6872 | 4.121050 | 5.25460 | 6.738700 | 289.8264 |
| 478 | 1579.0 | 0.126662 | 5.033139 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 200.0000 |
| 479 | 1579.0 | 3.279725 | 2.626341 | 0.6459 | 2.623400 | 3.18340 | 3.625050 | 63.3336 |
| 480 | 1579.0 | 76.008389 | 36.031852 | 8.8406 | 52.999150 | 70.78790 | 94.398250 | 221.9747 |
| 481 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 482 | 1561.0 | 319.921206 | 281.602660 | 0.0000 | 0.000000 | 294.51020 | 515.823700 | 999.4135 |
| 483 | 1561.0 | 206.239130 | 192.299424 | 0.0000 | 81.355900 | 148.13030 | 262.857100 | 989.4737 |
| 484 | 1561.0 | 215.699477 | 214.738928 | 0.0000 | 75.862100 | 138.60530 | 296.593200 | 996.8586 |
| 485 | 1561.0 | 202.009071 | 219.554971 | 0.0000 | 50.799000 | 113.52010 | 288.961700 | 994.0000 |
| 486 | 1561.0 | 301.768000 | 287.323543 | 0.0000 | 0.000000 | 247.84410 | 501.573000 | 999.4911 |
| 487 | 1561.0 | 238.621822 | 263.509288 | 0.0000 | 55.494800 | 111.46260 | 397.366300 | 995.7447 |
| 488 | 1561.0 | 351.880772 | 252.016297 | 0.0000 | 139.891600 | 347.73660 | 510.599100 | 997.5186 |
| 489 | 1561.0 | 272.050198 | 227.579190 | 0.0000 | 112.870300 | 219.94880 | 376.579400 | 994.0035 |
| 490 | 1584.0 | 51.351319 | 18.087378 | 13.7225 | 38.370775 | 48.52840 | 61.476200 | 142.8436 |
| 491 | 1573.0 | 2.442662 | 1.230739 | 0.5558 | 1.747100 | 2.25080 | 2.839800 | 12.7698 |
| 492 | 226.0 | 8.170943 | 1.759262 | 4.8882 | 6.924650 | 8.00895 | 9.078900 | 21.0443 |
| 493 | 1585.0 | 2.538322 | 0.973779 | 0.8330 | 1.665400 | 2.57400 | 3.202900 | 9.4024 |
| 494 | 1585.0 | 0.956008 | 6.578942 | 0.0342 | 0.139000 | 0.23260 | 0.574100 | 127.5728 |
| 495 | 1585.0 | 6.810506 | 3.250016 | 1.7720 | 5.275000 | 6.62860 | 7.907200 | 107.6926 |
| 496 | 1534.0 | 30.010047 | 24.681167 | 4.8135 | 16.358200 | 22.16925 | 32.593850 | 219.6436 |
| 497 | 1534.0 | 11.838940 | 4.955132 | 1.9496 | 8.168375 | 10.93355 | 14.471350 | 40.2818 |
| 498 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 499 | 1583.0 | 263.135949 | 324.799275 | 0.0000 | 0.000000 | 0.00000 | 536.122600 | 1000.0000 |
| 500 | 1583.0 | 240.996718 | 323.288606 | 0.0000 | 0.000000 | 0.00000 | 506.166350 | 999.2337 |
| 501 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 502 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 503 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 504 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 505 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 506 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 507 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 508 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 509 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 510 | 1583.0 | 56.451377 | 38.508172 | 0.0000 | 35.381300 | 47.32650 | 64.785100 | 451.4851 |
| 511 | 1583.0 | 274.852408 | 329.230621 | 0.0000 | 0.000000 | 0.00000 | 552.554150 | 1000.0000 |
| 512 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 513 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 514 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 515 | 1579.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 516 | 549.0 | 0.678898 | 10.783880 | 0.0287 | 0.121500 | 0.17470 | 0.264900 | 252.8604 |
| 517 | 549.0 | 1.738902 | 4.890663 | 0.2880 | 0.890300 | 1.15430 | 1.759700 | 113.2758 |
| 518 | 549.0 | 1.806273 | 4.715894 | 0.4674 | 1.171200 | 1.58910 | 1.932800 | 111.3495 |
| 519 | 863.0 | 11.761329 | 15.771210 | 0.0000 | 4.173800 | 5.88270 | 11.035850 | 184.3488 |
| 520 | 1585.0 | 2.701225 | 5.675963 | 0.3121 | 1.556400 | 2.22550 | 2.906300 | 111.7365 |
| 521 | 1585.0 | 11.478230 | 102.542780 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 1000.0000 |
| 522 | 1585.0 | 14.752589 | 7.097899 | 2.6811 | 10.212900 | 13.74770 | 17.835500 | 137.9838 |
| 523 | 1585.0 | 0.451093 | 4.124320 | 0.0258 | 0.072900 | 0.09980 | 0.132900 | 111.3330 |
| 524 | 1585.0 | 5.685903 | 20.546003 | 1.3104 | 3.779000 | 4.88760 | 6.450000 | 818.0005 |
| 525 | 1561.0 | 5.559936 | 3.904061 | 1.5400 | 4.101600 | 5.13420 | 6.329600 | 80.0406 |
| 526 | 1585.0 | 1.443844 | 0.957816 | 0.1705 | 0.486200 | 1.55010 | 2.211700 | 8.2037 |
| 527 | 1585.0 | 6.398382 | 1.881284 | 2.1700 | 4.907700 | 6.41690 | 7.584600 | 14.4479 |
| 528 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 529 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 530 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 531 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 532 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 533 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 534 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 535 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 536 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 537 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 538 | 1576.0 | 0.000000 | 0.000000 | 0.0000 | 0.000000 | 0.00000 | 0.000000 | 0.0000 |
| 539 | 1577.0 | 3.018310 | 1.255013 | 0.8516 | 1.861500 | 3.01850 | 3.939600 | 6.5803 |
| 540 | 1577.0 | 1.955258 | 0.738998 | 0.6144 | 1.395900 | 1.79290 | 2.465900 | 4.0825 |
| 541 | 1577.0 | 9.598204 | 2.888613 | 3.2761 | 7.487300 | 9.45930 | 11.169100 | 25.7792 |
| 542 | 1583.0 | 0.111190 | 0.002726 | 0.1053 | 0.109600 | 0.10960 | 0.113400 | 0.1184 |
| 543 | 1583.0 | 0.008463 | 0.001526 | 0.0051 | 0.007800 | 0.00780 | 0.009000 | 0.0240 |
| 544 | 1583.0 | 0.002510 | 0.000294 | 0.0016 | 0.002400 | 0.00260 | 0.002600 | 0.0047 |
| 545 | 1583.0 | 7.605770 | 1.309094 | 4.4294 | 7.116000 | 7.11600 | 7.997200 | 21.0443 |
| 546 | 1324.0 | 1.039565 | 0.388464 | 0.4444 | 0.796400 | 0.91120 | 1.285475 | 3.9786 |
| 547 | 1324.0 | 403.536557 | 5.139674 | 372.8220 | 400.684000 | 403.12400 | 407.464000 | 421.7020 |
| 548 | 1324.0 | 75.681332 | 3.384824 | 71.0380 | 73.254000 | 74.11000 | 78.386500 | 83.7200 |
| 549 | 1324.0 | 0.662945 | 0.670912 | 0.0446 | 0.226525 | 0.47240 | 0.850650 | 7.0656 |
| 550 | 1324.0 | 17.087644 | 5.859657 | 6.1100 | 14.530000 | 16.32000 | 18.980000 | 131.6800 |
| 551 | 1324.0 | 1.257931 | 1.710921 | 0.1200 | 0.870000 | 1.15000 | 1.370000 | 39.3300 |
| 552 | 1324.0 | 0.276518 | 0.275159 | 0.0187 | 0.094900 | 0.19805 | 0.358225 | 2.7182 |
| 553 | 1324.0 | 7.735242 | 2.567560 | 2.7860 | 6.738100 | 7.40595 | 8.626000 | 56.9303 |
| 554 | 1324.0 | 0.515898 | 0.756264 | 0.0520 | 0.345275 | 0.47840 | 0.562325 | 17.4781 |
| 555 | 1324.0 | 57.753862 | 35.190290 | 4.8269 | 27.017600 | 54.43250 | 74.564875 | 303.5500 |
| 556 | 1324.0 | 4.237188 | 1.533317 | 1.4967 | 3.625100 | 4.06710 | 4.700325 | 35.3198 |
| 557 | 1324.0 | 1.660655 | 2.356256 | 0.1646 | 1.182900 | 1.52735 | 1.815600 | 54.2917 |
| 558 | 1584.0 | 0.995715 | 0.085656 | 0.8919 | 0.955200 | 0.97270 | 1.000800 | 1.5121 |
| 559 | 1584.0 | 0.326157 | 0.201719 | 0.0699 | 0.149700 | 0.29090 | 0.443750 | 1.0737 |
| 560 | 1584.0 | 0.072845 | 0.052343 | 0.0177 | 0.036200 | 0.05920 | 0.089500 | 0.4457 |
| 561 | 1584.0 | 32.295110 | 19.010921 | 7.2369 | 15.771350 | 29.73115 | 44.113400 | 101.1146 |
| 562 | 1305.0 | 262.737519 | 7.607158 | 242.2860 | 259.980000 | 264.27200 | 265.708000 | 311.4040 |
| 563 | 1305.0 | 0.679764 | 0.121710 | 0.3049 | 0.567100 | 0.65110 | 0.769900 | 1.2988 |
| 564 | 1305.0 | 6.435571 | 2.633566 | 0.9700 | 4.980000 | 5.16000 | 7.780000 | 32.5800 |
| 565 | 1305.0 | 0.145828 | 0.081962 | 0.0224 | 0.087700 | 0.11960 | 0.186300 | 0.6892 |
| 566 | 1305.0 | 2.607670 | 1.033128 | 0.4122 | 2.090200 | 2.15300 | 3.105000 | 14.0141 |
| 567 | 1305.0 | 0.060163 | 0.033119 | 0.0091 | 0.038200 | 0.04860 | 0.075200 | 0.2932 |
| 568 | 1305.0 | 2.448761 | 0.996600 | 0.3706 | 1.884400 | 1.99850 | 2.961500 | 12.7462 |
| 569 | 1305.0 | 21.135136 | 10.269737 | 3.2504 | 15.466200 | 17.05500 | 24.765800 | 84.8024 |
| 570 | 1585.0 | 530.537839 | 17.431658 | 317.1964 | 530.703600 | 532.42270 | 534.356400 | 589.5082 |
| 571 | 1585.0 | 2.102700 | 0.274126 | 0.9802 | 1.983300 | 2.12010 | 2.290400 | 2.7395 |
| 572 | 1585.0 | 28.394756 | 86.066831 | 3.5400 | 7.500000 | 8.67000 | 10.130000 | 454.5600 |
| 573 | 1585.0 | 0.344791 | 0.247348 | 0.0667 | 0.239500 | 0.29340 | 0.366500 | 2.1967 |
| 574 | 1585.0 | 9.147274 | 26.856758 | 1.0395 | 2.566500 | 2.97360 | 3.492500 | 170.0204 |
| 575 | 1585.0 | 0.104495 | 0.067491 | 0.0230 | 0.075100 | 0.08950 | 0.112100 | 0.5502 |
| 576 | 1585.0 | 5.553429 | 16.878427 | 0.6636 | 1.409100 | 1.62450 | 1.902000 | 90.4235 |
| 577 | 1585.0 | 16.594504 | 12.428400 | 4.5820 | 11.477900 | 13.79630 | 17.080900 | 96.9601 |
| 578 | 631.0 | 0.021680 | 0.011760 | -0.0169 | 0.013850 | 0.02040 | 0.027700 | 0.1028 |
| 579 | 631.0 | 0.016909 | 0.009654 | 0.0032 | 0.010700 | 0.01490 | 0.020100 | 0.0799 |
| 580 | 631.0 | 0.005418 | 0.003114 | 0.0010 | 0.003400 | 0.00470 | 0.006500 | 0.0286 |
| 581 | 631.0 | 97.844812 | 86.820188 | 0.0000 | 46.458350 | 73.15020 | 115.607150 | 737.3048 |
| 582 | 1584.0 | 0.500078 | 0.003446 | 0.4778 | 0.497900 | 0.50015 | 0.502300 | 0.5098 |
| 583 | 1584.0 | 0.015611 | 0.020658 | 0.0060 | 0.011600 | 0.01380 | 0.016500 | 0.4766 |
| 584 | 1584.0 | 0.003912 | 0.004484 | 0.0017 | 0.003100 | 0.00360 | 0.004100 | 0.1045 |
| 585 | 1584.0 | 3.129034 | 4.304567 | 1.1975 | 2.305900 | 2.76035 | 3.295575 | 99.3032 |
| 586 | 1583.0 | 0.021490 | 0.012354 | -0.0169 | 0.013500 | 0.02050 | 0.027600 | 0.1028 |
| 587 | 1583.0 | 0.016503 | 0.008813 | 0.0032 | 0.010600 | 0.01480 | 0.020300 | 0.0799 |
| 588 | 1583.0 | 0.005291 | 0.002866 | 0.0010 | 0.003350 | 0.00460 | 0.006400 | 0.0286 |
| 589 | 1583.0 | 99.527550 | 93.484665 | 0.0000 | 44.368600 | 72.10940 | 114.749700 | 737.3048 |
| Pass/Fail | 1585.0 | -0.749527 | 1.326108 | -1.0000 | -1.000000 | -1.00000 | -1.000000 | 18.0000 |
# Drop columns that carry a constant signal (zero standard deviation),
# since they contain no discriminating information for the classifier
cols = com.select_dtypes([np.number]).columns
std = com[cols].std()
cols_to_drop = std[std == 0].index
com.drop(cols_to_drop, axis=1, inplace=True)
com.head()
(Output truncated: the frame is too wide to render here. `com.head()` now shows only the surviving column labels — 0, 1, 2, 3, 4, 6, 7, … through 589, plus `Pass/Fail` — confirming that the constant-signal columns such as 5, 13, 42 and 49 have been removed.)
| 0 | 3030.93 | 2564.00 | 2187.7333 | 1411.1265 | 1.3602 | 97.6133 | 0.1242 | 1.5005 | 0.0162 | -0.0034 | 0.9455 | 202.4396 | 7.9558 | 414.8710 | 10.0433 | 0.9680 | 192.3963 | 12.5190 | 1.4026 | -5419.00 | 2916.50 | -4043.75 | 751.00 | 0.8955 | 1.7730 | 3.0490 | 64.2333 | 2.0222 | 0.1632 | 3.5191 | 83.3971 | 9.5126 | 50.6170 | 64.2588 | 49.3830 | 66.3141 | 86.9555 | 117.5132 | 61.29 | 4.515 | 352.7173 | 10.1841 | 130.3691 | 723.3092 | 1.3072 | 141.2282 | 624.3145 | 218.3174 | 4.592 | 4.841 | 2834.0 | 0.9317 | 0.9484 | 4.7057 | -1.7264 | 350.9264 | 10.6231 | 108.6427 | 16.1445 | 21.7264 | 29.5367 | 693.7724 | 0.9226 | 148.6009 | 608.1700 | 84.0793 | NaN | NaN | 0.0 | 0.0126 | -0.0206 | 0.0141 | -0.0307 | -0.0083 | -0.0026 | -0.0567 | -0.0044 | 7.2163 | 0.1320 | NaN | 2.3895 | 0.9690 | 1747.6049 | 0.1841 | 8671.9301 | -0.3274 | -0.0055 | -0.0001 | 0.0001 | 0.0003 | -0.2786 | 0.3974 | -0.0251 | 0.0002 | 0.0002 | 0.1350 | -0.0042 | 0.0003 | 0.0056 | 0.0000 | -0.2468 | 0.3196 | NaN | NaN | NaN | NaN | 0.9460 | 0.0 | 748.6115 | 0.9908 | 58.4306 | 0.6002 | 0.9804 | 6.3788 | 15.88 | 2.639 | 15.94 | 15.93 | 0.8656 | 3.353 | 0.4098 | 3.188 | -0.0473 | 0.7243 | 0.9960 | 2.2967 | 1000.7263 | 39.2373 | 123.0 | 111.3 | 75.2 | 46.2000 | 350.6710 | 0.3948 | 6.78 | 0.0034 | 0.0898 | 0.0850 | 0.0358 | 0.0328 | 12.2566 | 4.271 | 10.284 | 0.4734 | 0.0167 | 11.8901 | 0.41 | 0.0506 | NaN | NaN | 1017.0 | 967.0 | 1066.0 | 368.0 | 0.090 | 0.048 | 0.095 | 2.0 | 0.9 | 0.069 | 0.046 | 0.7250 | 0.1139 | 0.3183 | 0.5888 | 0.3184 | 0.9499 | 0.3979 | 0.160 | 20.95 | 0.333 | 12.49 | 16.713 | 0.0803 | 5.72 | 11.19 | 65.363 | 0.292 | 5.38 | 20.10 | 0.296 | 10.62 | 10.30 | 5.38 | 4.040 | 16.230 | 0.2951 | 8.64 | 0.0 | 10.30 | 97.314 | 0.0 | 0.0772 | 0.0599 | 0.0700 | 0.0547 | 0.0704 | 0.0520 | 0.0301 | 0.1135 | 3.4789 | 0.0010 | NaN | 0.0707 | 0.0211 | 175.2173 | 0.0315 | 1940.3994 | 0.0744 | 0.0546 | 0.0027 | 0.0040 | NaN | NaN | NaN | NaN | 0.0188 | 0.0 | 219.9453 | 0.0011 | 2.8374 | 0.0189 | 
0.0050 | 0.4269 | 0.0472 | 40.855 | 4.5152 | 30.9815 | 33.9606 | 22.9057 | 15.9525 | 110.2144 | 0.1310 | 2.5883 | 0.0010 | 0.0319 | 0.0197 | 0.0120 | 0.0109 | 3.9321 | 1.5123 | 3.5811 | 0.1337 | 0.0055 | 3.8447 | 0.1077 | 0.0167 | NaN | NaN | 418.1363 | 398.3185 | 496.1582 | 158.3330 | 0.0373 | 0.0202 | 0.0462 | 0.6083 | 0.3032 | 0.0200 | 0.0174 | 0.2827 | 0.0434 | 0.1342 | 0.2419 | 0.1343 | 0.3670 | 0.1431 | 0.0610 | 6.2698 | 0.1181 | 3.8208 | 5.3737 | 0.0254 | 1.6252 | 3.2461 | 18.0118 | 0.0752 | 1.5989 | 6.5893 | 0.0913 | 3.0911 | 8.4654 | 1.5989 | 1.2293 | 5.3406 | 0.0867 | 2.8551 | 0.0 | 2.9971 | 31.8843 | NaN | NaN | 0.0 | 0.0215 | 0.0274 | 0.0315 | 0.0238 | 0.0206 | 0.0238 | 0.0144 | 0.0491 | 1.2708 | 0.0004 | NaN | 0.0229 | 0.0065 | 55.2039 | 0.0105 | 560.2658 | 0.0170 | 0.0148 | 0.0124 | 0.0114 | 0.0010 | 0.0013 | NaN | NaN | NaN | NaN | 0.0055 | 0.0 | 61.5932 | 0.0003 | 0.9967 | 0.0082 | 0.0017 | 0.1437 | 0.0151 | 14.2396 | 1.4392 | 5.6188 | 3.6721 | 2.9329 | 2.1118 | 24.8504 | 29.0271 | 6.9458 | 2.7380 | 5.9846 | 525.0965 | 0.0000 | 3.4641 | 6.0544 | 53.6840 | 2.4788 | 4.7141 | 1.7275 | 6.1800 | 3.2750 | 3.6084 | 18.7673 | 33.1562 | 26.3617 | 49.0013 | 10.0503 | 2.7073 | 3.1158 | 3.1136 | 44.5055 | 42.2737 | 1.3071 | 0.8693 | 1.1975 | 0.6288 | 0.9163 | 0.6448 | 1.4324 | 0.4576 | 0.1362 | 5.9396 | 3.2698 | 9.5805 | 2.3106 | 6.1463 | 4.0502 | 1.7924 | 29.9394 | 6.2052 | 311.6377 | 5.7277 | 2.7864 | 9.7752 | 63.7987 | 24.7625 | 13.6778 | 2.3394 | 31.9893 | 5.8142 | 0.0 | 1.6936 | 115.7408 | 613.3069 | 291.4842 | 494.6996 | 178.1759 | 843.1138 | 0.0000 | 53.1098 | 0.0000 | 48.2091 | 0.7578 | NaN | 2.9570 | 2.1739 | 10.0261 | 17.1202 | 22.3756 | 0.0000 | 0.0000 | 64.6707 | 0.0000 | NaN | NaN | NaN | NaN | 1.9864 | 0.0 | 29.3804 | 0.1094 | 4.8560 | 3.1406 | 0.5064 | 6.6926 | 2.0570 | 4.0825 | 11.5074 | 0.1096 | 0.0078 | 0.0026 | 7.116 | 1.0616 | 395.570 | 75.752 | 0.4234 | 12.93 | 0.78 | 0.1827 | 5.7349 | 0.3363 | 39.8842 | 3.2687 | 1.0297 | 1.0344 | 0.4385 | 
0.1039 | 42.3877 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | 533.8500 | 2.1113 | 8.95 | 0.3157 | 3.0624 | 0.1026 | 1.6765 | 14.9509 | NaN | NaN | NaN | NaN | 0.5005 | 0.0118 | 0.0035 | 2.3630 | NaN | NaN | NaN | NaN | -1 |
| 1 | 3095.78 | 2465.14 | 2230.4222 | 1463.6606 | 0.8294 | 102.3433 | 0.1247 | 1.4966 | -0.0005 | -0.0148 | 0.9627 | 200.5470 | 10.1548 | 414.7347 | 9.2599 | 0.9701 | 191.2872 | 12.4608 | 1.3825 | -5441.50 | 2604.25 | -3498.75 | -1640.25 | 1.2973 | 2.0143 | 7.3900 | 68.4222 | 2.2667 | 0.2102 | 3.4171 | 84.9052 | 9.7997 | 50.6596 | 64.2828 | 49.3404 | 64.9193 | 87.5241 | 118.1188 | 78.25 | 2.773 | 352.2445 | 10.0373 | 133.1727 | 724.8264 | 1.2887 | 145.8445 | 631.2618 | 205.1695 | 4.590 | 4.842 | 2853.0 | 0.9324 | 0.9479 | 4.6820 | 0.8073 | 352.0073 | 10.3092 | 113.9800 | 10.9036 | 19.1927 | 27.6301 | 697.1964 | 1.1598 | 154.3709 | 620.3582 | 82.3494 | NaN | NaN | 0.0 | -0.0039 | -0.0198 | 0.0004 | -0.0440 | -0.0358 | -0.0120 | -0.0377 | 0.0017 | 6.8043 | 0.1358 | NaN | 2.3754 | 0.9894 | 1931.6464 | 0.1874 | 8407.0299 | 0.1455 | -0.0015 | 0.0000 | -0.0005 | 0.0001 | 0.5854 | -0.9353 | -0.0158 | -0.0004 | -0.0004 | -0.0752 | -0.0045 | 0.0002 | 0.0015 | 0.0000 | 0.0772 | -0.0903 | NaN | NaN | NaN | NaN | 0.9425 | 0.0 | 731.2517 | 0.9902 | 58.6680 | 0.5958 | 0.9731 | 6.5061 | 15.88 | 2.541 | 15.91 | 15.88 | 0.8703 | 2.771 | 0.4138 | 3.272 | -0.0946 | 0.8122 | 0.9985 | 2.2932 | 998.1081 | 37.9213 | 98.0 | 80.3 | 81.0 | 56.2000 | 219.7679 | 0.2301 | 5.70 | 0.0049 | 0.1356 | 0.0600 | 0.0547 | 0.0204 | 12.3319 | 6.285 | 13.077 | 0.5666 | 0.0144 | 11.8428 | 0.35 | 0.0437 | NaN | NaN | 568.0 | 59.0 | 297.0 | 3277.0 | 0.112 | 0.115 | 0.124 | 2.2 | 1.1 | 0.079 | 0.561 | 1.0498 | 0.1917 | 0.4115 | 0.6582 | 0.4115 | 1.0181 | 0.2315 | 0.325 | 17.99 | 0.439 | 10.14 | 16.358 | 0.0892 | 6.92 | 9.05 | 82.986 | 0.222 | 3.74 | 19.59 | 0.316 | 11.65 | 8.02 | 3.74 | 3.659 | 15.078 | 0.3580 | 8.96 | 0.0 | 8.02 | 134.250 | 0.0 | 0.0566 | 0.0488 | 0.1651 | 0.1578 | 0.0468 | 0.0987 | 0.0734 | 0.0747 | 3.9578 | 0.0050 | NaN | 0.0761 | 0.0014 | 128.4285 | 0.0238 | 1988.0000 | 0.0203 | 0.0236 | 0.0064 | 0.0036 | NaN | NaN | NaN | NaN | 0.0154 | 0.0 | 193.0287 | 0.0007 | 3.8999 | 0.0187 | 0.0086 
| 0.5749 | 0.0411 | 29.743 | 3.6327 | 29.0598 | 28.9862 | 22.3163 | 17.4008 | 83.5542 | 0.0767 | 1.8459 | 0.0012 | 0.0440 | 0.0171 | 0.0154 | 0.0069 | 3.9011 | 2.1016 | 3.9483 | 0.1662 | 0.0049 | 3.7836 | 0.1000 | 0.0139 | NaN | NaN | 233.9865 | 26.5879 | 139.2082 | 1529.7622 | 0.0502 | 0.0561 | 0.0591 | 0.8151 | 0.3464 | 0.0291 | 0.1822 | 0.3814 | 0.0715 | 0.1667 | 0.2630 | 0.1667 | 0.3752 | 0.0856 | 0.1214 | 5.6522 | 0.1417 | 2.9939 | 5.2445 | 0.0264 | 1.8045 | 2.7661 | 23.6230 | 0.0778 | 1.1506 | 5.9247 | 0.0878 | 3.3604 | 7.7421 | 1.1506 | 1.1265 | 5.0108 | 0.1013 | 2.4278 | 0.0 | 2.4890 | 41.7080 | NaN | NaN | 0.0 | 0.0142 | 0.0230 | 0.0768 | 0.0729 | 0.0143 | 0.0513 | 0.0399 | 0.0365 | 1.2474 | 0.0017 | NaN | 0.0248 | 0.0005 | 46.3453 | 0.0069 | 677.1873 | 0.0053 | 0.0059 | 0.0081 | 0.0033 | 0.0022 | 0.0013 | NaN | NaN | NaN | NaN | 0.0049 | 0.0 | 65.0999 | 0.0002 | 1.1655 | 0.0068 | 0.0027 | 0.1921 | 0.0120 | 10.5837 | 1.0323 | 4.3465 | 2.5939 | 3.2858 | 2.5197 | 15.0150 | 27.7464 | 5.5695 | 3.9300 | 9.0604 | 0.0000 | 368.9713 | 2.1196 | 6.1491 | 61.8918 | 3.1531 | 6.1188 | 1.4857 | 6.1911 | 2.8088 | 3.1595 | 10.4383 | 2.2655 | 8.4887 | 199.7866 | 8.6336 | 5.7093 | 1.6779 | 3.2153 | 48.5294 | 37.5793 | 16.4174 | 1.2364 | 1.9562 | 0.8123 | 1.0239 | 0.8340 | 1.5683 | 0.2645 | 0.2751 | 5.1072 | 4.3737 | 7.6142 | 2.2568 | 6.9233 | 4.7448 | 1.4336 | 40.4475 | 4.7415 | 463.2883 | 5.5652 | 3.0652 | 10.2211 | 73.5536 | 19.4865 | 13.2430 | 2.1627 | 30.8643 | 5.8042 | 0.0 | 1.2928 | 163.0249 | 0.0000 | 246.7762 | 0.0000 | 359.0444 | 130.6350 | 820.7900 | 194.4371 | 0.0000 | 58.1666 | 3.6822 | NaN | 3.2029 | 0.1441 | 6.6487 | 12.6788 | 23.6469 | 0.0000 | 0.0000 | 141.4365 | 0.0000 | NaN | NaN | NaN | NaN | 1.6292 | 0.0 | 26.3970 | 0.0673 | 6.6475 | 3.1310 | 0.8832 | 8.8370 | 1.7910 | 2.9799 | 9.5796 | 0.1096 | 0.0078 | 0.0026 | 7.116 | 1.3526 | 408.798 | 74.640 | 0.7193 | 16.00 | 1.33 | 0.2829 | 7.1196 | 0.4989 | 53.1836 | 3.9139 | 1.7819 | 0.9634 | 0.1745 | 0.0375 | 
18.1087 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | 535.0164 | 2.4335 | 5.92 | 0.2653 | 2.0111 | 0.0772 | 1.1065 | 10.9003 | 0.0096 | 0.0201 | 0.0060 | 208.2045 | 0.5019 | 0.0223 | 0.0055 | 4.4447 | 0.0096 | 0.0201 | 0.0060 | 208.2045 | -1 |
| 2 | 2932.61 | 2559.94 | 2186.4111 | 1698.0172 | 1.5102 | 95.4878 | 0.1241 | 1.4436 | 0.0041 | 0.0013 | 0.9615 | 202.0179 | 9.5157 | 416.7075 | 9.3144 | 0.9674 | 192.7035 | 12.5404 | 1.4123 | -5447.75 | 2701.75 | -4047.00 | -1916.50 | 1.3122 | 2.0295 | 7.5788 | 67.1333 | 2.3333 | 0.1734 | 3.5986 | 84.7569 | 8.6590 | 50.1530 | 64.1114 | 49.8470 | 65.8389 | 84.7327 | 118.6128 | 14.37 | 5.434 | 364.3782 | 9.8783 | 131.8027 | 734.7924 | 1.2992 | 141.0845 | 637.2655 | 185.7574 | 4.486 | 4.748 | 2936.0 | 0.9139 | 0.9447 | 4.5873 | 23.8245 | 364.5364 | 10.1685 | 115.6273 | 11.3019 | 16.1755 | 24.2829 | 710.5095 | 0.8694 | 145.8000 | 625.9636 | 84.7681 | 140.6972 | 485.2665 | 0.0 | -0.0078 | -0.0326 | -0.0052 | 0.0213 | -0.0054 | -0.1134 | -0.0182 | 0.0287 | 7.1041 | 0.1362 | NaN | 2.4532 | 0.9880 | 1685.8514 | 0.1497 | 9317.1698 | 0.0553 | 0.0006 | -0.0013 | 0.0000 | 0.0002 | -0.1343 | -0.1427 | 0.1218 | 0.0006 | -0.0001 | 0.0134 | -0.0026 | -0.0016 | -0.0006 | 0.0013 | -0.0301 | -0.0728 | NaN | NaN | NaN | 0.4684 | 0.9231 | 0.0 | 718.5777 | 0.9899 | 58.4808 | 0.6015 | 0.9772 | 6.4527 | 15.90 | 2.882 | 15.94 | 15.95 | 0.8798 | 3.094 | 0.4777 | 3.272 | -0.1892 | 0.8194 | 0.9978 | 2.2592 | 998.4440 | 42.0579 | 89.0 | 126.4 | 96.5 | 45.1001 | 306.0380 | 0.3263 | 8.33 | 0.0038 | 0.0754 | 0.0483 | 0.0619 | 0.0221 | 8.2660 | 4.819 | 8.443 | 0.4909 | 0.0177 | 8.2054 | 0.47 | 0.0497 | NaN | NaN | 562.0 | 788.0 | 759.0 | 2100.0 | 0.187 | 0.117 | 0.068 | 2.1 | 1.4 | 0.123 | 0.319 | 1.0824 | 0.0369 | 0.3141 | 0.5753 | 0.3141 | 0.9677 | 0.2706 | 0.326 | 17.78 | 0.745 | 13.31 | 22.912 | 0.1959 | 9.21 | 17.87 | 60.110 | 0.139 | 5.09 | 19.75 | 0.949 | 9.71 | 16.73 | 5.09 | 11.059 | 22.624 | 0.1164 | 13.30 | 0.0 | 16.73 | 79.618 | 0.0 | 0.0339 | 0.0494 | 0.0696 | 0.0406 | 0.0401 | 0.0840 | 0.0349 | 0.0718 | 2.4266 | 0.0014 | NaN | 0.0963 | 0.0152 | 182.4956 | 0.0284 | 839.6006 | 0.0192 | 0.0170 | 0.0062 | 0.0040 | NaN | NaN | NaN | 0.1729 | 0.0273 | 0.0 | 104.4042 | 0.0007 | 4.1446 | 
0.0733 | 0.0063 | 0.4166 | 0.0487 | 29.621 | 3.9133 | 23.5510 | 41.3837 | 32.6256 | 15.7716 | 97.3868 | 0.1117 | 2.5274 | 0.0012 | 0.0249 | 0.0152 | 0.0157 | 0.0075 | 2.8705 | 1.5306 | 2.5493 | 0.1479 | 0.0059 | 2.8046 | 0.1185 | 0.0167 | NaN | NaN | 251.4536 | 329.6406 | 325.0672 | 902.4576 | 0.0800 | 0.0583 | 0.0326 | 0.6964 | 0.4031 | 0.0416 | 0.1041 | 0.3846 | 0.0151 | 0.1288 | 0.2268 | 0.1288 | 0.3677 | 0.1175 | 0.1261 | 5.7247 | 0.2682 | 3.8541 | 6.1797 | 0.0546 | 2.5680 | 4.6067 | 16.0104 | 0.0243 | 1.5481 | 5.9453 | 0.2777 | 3.1600 | 8.9855 | 1.5481 | 2.9844 | 6.2277 | 0.0353 | 3.7663 | 0.0 | 5.6983 | 24.7959 | 13.5664 | 15.4488 | 0.0 | 0.0105 | 0.0208 | 0.0327 | 0.0171 | 0.0116 | 0.0428 | 0.0154 | 0.0383 | 0.7786 | 0.0005 | NaN | 0.0302 | 0.0046 | 58.0575 | 0.0092 | 283.6616 | 0.0054 | 0.0043 | 0.0030 | 0.0037 | 0.0021 | 0.0015 | NaN | NaN | NaN | 0.0221 | 0.0100 | 0.0 | 28.7334 | 0.0003 | 1.2356 | 0.0190 | 0.0020 | 0.1375 | 0.0190 | 11.4871 | 1.1798 | 4.0782 | 4.3102 | 3.7696 | 2.0627 | 18.0233 | 21.6062 | 8.7236 | 3.0609 | 5.2231 | 0.0000 | 0.0000 | 2.2943 | 4.0917 | 50.6425 | 2.0261 | 5.2707 | 1.8268 | 4.2581 | 3.7479 | 3.5220 | 10.3162 | 29.1663 | 18.7546 | 109.5747 | 14.2503 | 5.7650 | 0.8972 | 3.1281 | 60.0000 | 70.9161 | 8.8647 | 1.2771 | 0.4264 | 0.6263 | 0.8973 | 0.6301 | 1.4698 | 0.3194 | 0.2748 | 4.8795 | 7.5418 | 10.0984 | 3.1182 | 15.0790 | 6.5280 | 2.8042 | 32.3594 | 3.0301 | 21.3645 | 5.4178 | 9.3327 | 8.3977 | 148.0287 | 31.4674 | 45.5423 | 3.1842 | 13.3923 | 9.1221 | 0.0 | 2.6727 | 93.9245 | 434.2674 | 151.7665 | 0.0000 | 190.3869 | 746.9150 | 74.0741 | 191.7582 | 250.1742 | 34.1573 | 1.0281 | NaN | 3.9238 | 1.5357 | 10.8251 | 18.9849 | 9.0113 | 0.0000 | 0.0000 | 240.7767 | 244.2748 | NaN | NaN | NaN | 36.9067 | 2.9626 | 0.0 | 14.5293 | 0.0751 | 7.0870 | 12.1831 | 0.6451 | 6.4568 | 2.1538 | 2.9667 | 9.3046 | 0.1096 | 0.0078 | 0.0026 | 7.116 | 0.7942 | 411.136 | 74.654 | 0.1832 | 16.16 | 0.85 | 0.0857 | 7.1619 | 0.3752 | 23.0713 | 3.9306 | 
1.1386 | 1.5021 | 0.3718 | 0.1233 | 24.7524 | 267.064 | 0.9032 | 1.10 | 0.6219 | 0.4122 | 0.2562 | 0.4119 | 68.8489 | 535.0245 | 2.0293 | 11.21 | 0.1882 | 4.0923 | 0.0640 | 2.0952 | 9.2721 | 0.0584 | 0.0484 | 0.0148 | 82.8602 | 0.4958 | 0.0157 | 0.0039 | 3.1745 | 0.0584 | 0.0484 | 0.0148 | 82.8602 | 1 |
| 3 | 2988.72 | 2479.90 | 2199.0333 | 909.7926 | 1.3204 | 104.2367 | 0.1217 | 1.4882 | -0.0124 | -0.0033 | 0.9629 | 201.8482 | 9.6052 | 422.2894 | 9.6924 | 0.9687 | 192.1557 | 12.4782 | 1.4011 | -5468.25 | 2648.25 | -4515.00 | -1657.25 | 1.3137 | 2.0038 | 7.3145 | 62.9333 | 2.6444 | 0.2071 | 3.3813 | 84.9105 | 8.6789 | 50.5100 | 64.1125 | 49.4900 | 65.1951 | 86.6867 | 117.0442 | 76.90 | 1.279 | 363.0273 | 9.9305 | 131.8027 | 733.8778 | 1.3027 | 142.5427 | 637.3727 | 189.9079 | 4.486 | 4.748 | 2936.0 | 0.9139 | 0.9447 | 4.5873 | 24.3791 | 361.4582 | 10.2112 | 116.1818 | 13.5597 | 15.6209 | 23.4736 | 710.4043 | 0.9761 | 147.6545 | 625.2945 | 70.2289 | 160.3210 | 464.9735 | 0.0 | -0.0555 | -0.0461 | -0.0400 | 0.0400 | 0.0676 | -0.1051 | 0.0028 | 0.0277 | 7.5925 | 0.1302 | NaN | 2.4004 | 0.9904 | 1752.0968 | 0.1958 | 8205.7000 | 0.0697 | -0.0003 | -0.0021 | -0.0001 | 0.0002 | 0.0411 | 0.0177 | -0.0195 | -0.0002 | 0.0000 | -0.0699 | -0.0059 | 0.0003 | 0.0003 | 0.0021 | -0.0483 | -0.1180 | NaN | NaN | NaN | 0.4647 | 0.9564 | 0.0 | 709.0867 | 0.9906 | 58.6635 | 0.6016 | 0.9761 | 6.4935 | 15.55 | 3.132 | 15.61 | 15.59 | 1.3660 | 2.480 | 0.5176 | 3.119 | 0.2838 | 0.7244 | 0.9961 | 2.3802 | 980.4510 | 41.1025 | 127.0 | 118.0 | 123.7 | 47.8000 | 162.4320 | 0.1915 | 5.51 | 0.0030 | 0.1140 | 0.0393 | 0.0613 | 0.0190 | 13.2651 | 9.073 | 15.241 | 1.3029 | 0.0150 | 11.9738 | 0.35 | 0.0699 | NaN | NaN | 859.0 | 355.0 | 3433.0 | 3004.0 | 0.068 | 0.108 | 0.100 | 1.7 | 0.9 | 0.086 | 0.241 | 0.9386 | 0.0356 | 0.2618 | 0.4391 | 0.2618 | 0.8567 | 0.2452 | 0.390 | 16.22 | 0.693 | 14.67 | 22.562 | 0.1786 | 5.69 | 18.20 | 52.571 | 0.139 | 5.92 | 23.60 | 1.264 | 10.63 | 13.56 | 5.92 | 11.382 | 24.320 | 0.3458 | 9.56 | 0.0 | 21.97 | 104.950 | 0.0 | 0.1248 | 0.0463 | 0.1223 | 0.0354 | 0.0708 | 0.0754 | 0.0643 | 0.0932 | 5.5398 | 0.0023 | NaN | 0.0764 | 0.0015 | 152.0885 | 0.0573 | 820.3999 | 0.0152 | 0.0149 | 0.0067 | 0.0040 | NaN | NaN | NaN | 0.0191 | 0.0234 | 0.0 | 94.0954 | 0.0010 | 3.2119 
| 0.0406 | 0.0072 | 0.4212 | 0.0513 | 31.830 | 3.1959 | 33.8960 | 37.8477 | 44.3906 | 16.9347 | 50.3631 | 0.0581 | 2.1775 | 0.0007 | 0.0417 | 0.0115 | 0.0172 | 0.0063 | 4.2154 | 2.8960 | 4.0526 | 0.3882 | 0.0049 | 3.9403 | 0.0916 | 0.0245 | NaN | NaN | 415.5048 | 157.0889 | 1572.6896 | 1377.4276 | 0.0285 | 0.0445 | 0.0465 | 0.6305 | 0.3046 | 0.0286 | 0.0824 | 0.3483 | 0.0128 | 0.1004 | 0.1701 | 0.1004 | 0.3465 | 0.0973 | 0.1675 | 5.4440 | 0.2004 | 4.1900 | 6.3329 | 0.0479 | 1.7339 | 4.9660 | 15.7375 | 0.0243 | 1.7317 | 6.6262 | 0.3512 | 3.2699 | 9.4020 | 1.7317 | 3.0672 | 6.6839 | 0.0928 | 3.0229 | 0.0 | 6.3292 | 29.0339 | 8.4026 | 4.8851 | 0.0 | 0.0407 | 0.0198 | 0.0531 | 0.0167 | 0.0224 | 0.0422 | 0.0273 | 0.0484 | 1.8222 | 0.0006 | NaN | 0.0252 | 0.0004 | 45.7058 | 0.0188 | 309.8492 | 0.0046 | 0.0049 | 0.0028 | 0.0034 | 0.0024 | 0.0014 | NaN | NaN | NaN | 0.0038 | 0.0068 | 0.0 | 32.4228 | 0.0003 | 1.1135 | 0.0132 | 0.0023 | 0.1348 | 0.0155 | 13.3972 | 1.1907 | 5.6363 | 3.9482 | 4.9881 | 2.1737 | 17.8537 | 14.5054 | 5.2860 | 2.4643 | 7.6602 | 317.7362 | 0.0000 | 1.9689 | 6.5718 | 94.4594 | 3.6091 | 13.4420 | 1.5441 | 6.2313 | 2.8049 | 4.9898 | 15.7089 | 13.4051 | 76.0354 | 181.2641 | 5.1760 | 5.3899 | 1.3671 | 2.7013 | 34.0336 | 41.5236 | 7.1274 | 1.1054 | 0.4097 | 0.5183 | 0.6849 | 0.5290 | 1.3141 | 0.2829 | 0.3332 | 4.4680 | 6.9785 | 11.1303 | 3.0744 | 13.7105 | 3.9918 | 2.8555 | 27.6824 | 3.0301 | 24.2831 | 6.5291 | 12.3786 | 9.1494 | 100.0021 | 37.8979 | 48.4887 | 3.4234 | 35.4323 | 6.4746 | 0.0 | 3.5135 | 149.4399 | 225.0169 | 100.4883 | 305.7500 | 88.5553 | 104.6660 | 71.7583 | 0.0000 | 336.7660 | 72.9635 | 1.7670 | NaN | 3.1817 | 0.1488 | 8.6804 | 29.2542 | 9.9979 | 0.0000 | 711.6418 | 113.5593 | 0.0000 | NaN | NaN | NaN | 4.1200 | 2.4416 | 0.0 | 13.2699 | 0.0977 | 5.4751 | 6.7553 | 0.7404 | 6.4865 | 2.1565 | 3.2465 | 7.7754 | 0.1096 | 0.0078 | 0.0026 | 7.116 | 1.1650 | 372.822 | 72.442 | 1.8804 | 131.68 | 39.33 | 0.6812 | 56.9303 | 17.4781 | 161.4081 | 
35.3198 | 54.2917 | 1.1613 | 0.7288 | 0.2710 | 62.7572 | 268.228 | 0.6511 | 7.32 | 0.1630 | 3.5611 | 0.0670 | 2.7290 | 25.0363 | 530.5682 | 2.0253 | 9.33 | 0.1738 | 2.8971 | 0.0525 | 1.7585 | 8.5831 | 0.0202 | 0.0149 | 0.0044 | 73.8432 | 0.4990 | 0.0103 | 0.0025 | 2.0544 | 0.0202 | 0.0149 | 0.0044 | 73.8432 | -1 |
| 4 | 3032.24 | 2502.87 | 2233.3667 | 1326.5200 | 1.5334 | 100.3967 | 0.1235 | 1.5031 | -0.0031 | -0.0072 | 0.9569 | 201.9424 | 10.5661 | 420.5925 | 10.3387 | 0.9735 | 191.6037 | 12.4735 | 1.3888 | -5476.25 | 2635.25 | -3987.50 | 117.00 | 1.2887 | 1.9912 | 7.2748 | 62.8333 | 3.1556 | 0.2696 | 3.2728 | 86.3269 | 8.7677 | 50.2480 | 64.1511 | 49.7520 | 66.1542 | 86.1468 | 121.4364 | 76.39 | 2.209 | 353.3400 | 10.4091 | 176.3136 | 789.7523 | 1.0341 | 138.0882 | 667.7418 | 233.5491 | 4.624 | 4.894 | 2865.0 | 0.9298 | 0.9449 | 4.6414 | -12.2945 | 355.0809 | 9.7948 | 144.0191 | 21.9782 | 32.2945 | 44.1498 | 745.6025 | 0.9256 | 146.6636 | 645.7636 | 65.8417 | NaN | NaN | 0.0 | -0.0534 | 0.0183 | -0.0167 | -0.0449 | 0.0034 | -0.0178 | -0.0123 | -0.0048 | 7.5017 | 0.1342 | NaN | 2.4530 | 0.9902 | 1828.3846 | 0.1829 | 9014.4600 | 0.0448 | -0.0077 | -0.0001 | -0.0001 | -0.0001 | 0.2189 | -0.6704 | -0.0167 | 0.0004 | -0.0003 | 0.0696 | -0.0045 | 0.0002 | 0.0078 | 0.0000 | -0.0799 | -0.2038 | NaN | NaN | NaN | NaN | 0.9424 | 0.0 | 796.5950 | 0.9908 | 58.3858 | 0.5913 | 0.9628 | 6.3551 | 15.75 | 3.148 | 15.73 | 15.71 | 0.9460 | 3.027 | 0.5328 | 3.299 | -0.5677 | 0.7780 | 1.0010 | 2.3715 | 993.1274 | 38.1448 | 119.0 | 143.2 | 123.1 | 48.8000 | 296.3030 | 0.3744 | 3.64 | 0.0041 | 0.0634 | 0.0451 | 0.0623 | 0.0240 | 14.2354 | 9.005 | 12.506 | 0.4434 | 0.0126 | 13.9047 | 0.43 | 0.0538 | NaN | NaN | 699.0 | 283.0 | 1747.0 | 1443.0 | 0.147 | 0.040 | 0.113 | 3.9 | 0.8 | 0.101 | 0.499 | 0.5760 | 0.0631 | 0.3053 | 0.5830 | 0.3053 | 0.8285 | 0.1308 | 0.922 | 15.24 | 0.282 | 10.85 | 37.715 | 0.1189 | 3.98 | 25.54 | 72.149 | 0.250 | 5.52 | 15.76 | 0.519 | 10.71 | 19.77 | 5.52 | 8.446 | 33.832 | 0.3951 | 9.09 | 0.0 | 19.77 | 92.307 | 0.0 | 0.0915 | 0.0506 | 0.0769 | 0.1079 | 0.0797 | 0.1047 | 0.0924 | 0.1015 | 4.1338 | 0.0030 | NaN | 0.0802 | 0.0004 | 69.1510 | 0.1970 | 1406.4004 | 0.0227 | 0.0272 | 0.0067 | 0.0031 | NaN | NaN | NaN | NaN | 0.0240 | 0.0 | 149.2172 | 0.0006 | 2.5775 | 0.0177 | 
0.0214 | 0.4051 | 0.0488 | 19.862 | 3.6163 | 34.1250 | 55.9626 | 53.0876 | 17.4864 | 88.7672 | 0.1092 | 1.0929 | 0.0013 | 0.0257 | 0.0116 | 0.0163 | 0.0080 | 4.4239 | 3.2376 | 3.6536 | 0.1293 | 0.0040 | 4.3474 | 0.1275 | 0.0181 | NaN | NaN | 319.1252 | 128.0296 | 799.5884 | 628.3083 | 0.0755 | 0.0181 | 0.0476 | 1.3500 | 0.2698 | 0.0320 | 0.1541 | 0.2155 | 0.0310 | 0.1354 | 0.2194 | 0.1354 | 0.3072 | 0.0582 | 0.3574 | 4.8956 | 0.0766 | 2.9130 | 11.0583 | 0.0327 | 1.1229 | 7.3296 | 23.1160 | 0.0822 | 1.6216 | 4.7279 | 0.1773 | 3.1550 | 9.7777 | 1.6216 | 2.5923 | 10.5352 | 0.1301 | 3.0939 | 0.0 | 6.3767 | 32.0537 | NaN | NaN | 0.0 | 0.0246 | 0.0221 | 0.0329 | 0.0522 | 0.0256 | 0.0545 | 0.0476 | 0.0463 | 1.5530 | 0.0010 | NaN | 0.0286 | 0.0001 | 21.0312 | 0.0573 | 494.7368 | 0.0063 | 0.0077 | 0.0052 | 0.0027 | 0.0025 | 0.0012 | NaN | NaN | NaN | NaN | 0.0089 | 0.0 | 57.2692 | 0.0002 | 0.8495 | 0.0065 | 0.0077 | 0.1356 | 0.0165 | 7.1493 | 1.1704 | 5.3823 | 4.7226 | 4.9184 | 2.1850 | 22.3369 | 24.4142 | 3.6256 | 3.3208 | 4.2178 | 0.0000 | 866.0295 | 2.5046 | 7.0492 | 85.2255 | 2.9734 | 4.2892 | 1.2943 | 7.2570 | 3.4473 | 3.8754 | 12.7642 | 10.7390 | 43.8119 | 0.0000 | 11.4064 | 2.0088 | 1.5533 | 6.2069 | 25.3521 | 37.4691 | 15.2470 | 0.6672 | 0.7198 | 0.6076 | 0.9088 | 0.6136 | 1.2524 | 0.1518 | 0.7592 | 4.3131 | 2.7092 | 6.1538 | 4.7756 | 11.4945 | 2.8822 | 3.8248 | 30.8924 | 5.3863 | 44.8980 | 4.4384 | 5.2987 | 7.4365 | 89.9529 | 17.0927 | 19.1303 | 4.5375 | 42.6838 | 6.1979 | 0.0 | 3.0615 | 140.1953 | 171.4486 | 276.8810 | 461.8619 | 240.1781 | 0.0000 | 587.3773 | 748.1781 | 0.0000 | 55.1057 | 2.2358 | NaN | 3.2712 | 0.0372 | 3.7821 | 107.6905 | 15.6016 | 293.1396 | 0.0000 | 148.0663 | 0.0000 | NaN | NaN | NaN | NaN | 2.5512 | 0.0 | 18.7319 | 0.0616 | 4.4146 | 2.9954 | 2.2181 | 6.3745 | 2.0579 | 1.9999 | 9.4805 | 0.1096 | 0.0078 | 0.0026 | 7.116 | 1.4636 | 399.914 | 79.156 | 1.0388 | 19.63 | 1.98 | 0.4287 | 9.7608 | 0.8311 | 70.9706 | 4.9086 | 2.5014 | 0.9778 | 0.2156 
| 0.0461 | 22.0500 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | 532.0155 | 2.0275 | 8.83 | 0.2224 | 3.1776 | 0.0706 | 1.6597 | 10.9698 | NaN | NaN | NaN | NaN | 0.4800 | 0.4766 | 0.1045 | 99.3032 | 0.0202 | 0.0149 | 0.0044 | 73.8432 | -1 |
# attribute types (all variables are float64 except the target)
com.dtypes
0 float64 1 float64 2 float64 3 float64 4 float64 6 float64 7 float64 8 float64 9 float64 10 float64 11 float64 12 float64 14 float64 15 float64 16 float64 17 float64 18 float64 19 float64 20 float64 21 float64 22 float64 23 float64 24 float64 25 float64 26 float64 27 float64 28 float64 29 float64 30 float64 31 float64 32 float64 33 float64 34 float64 35 float64 36 float64 37 float64 38 float64 39 float64 40 float64 41 float64 43 float64 44 float64 45 float64 46 float64 47 float64 48 float64 50 float64 51 float64 53 float64 54 float64 55 float64 56 float64 57 float64 58 float64 59 float64 60 float64 61 float64 62 float64 63 float64 64 float64 65 float64 66 float64 67 float64 68 float64 70 float64 71 float64 72 float64 73 float64 74 float64 75 float64 76 float64 77 float64 78 float64 79 float64 80 float64 81 float64 82 float64 83 float64 84 float64 85 float64 86 float64 87 float64 88 float64 89 float64 90 float64 91 float64 92 float64 93 float64 94 float64 95 float64 96 float64 98 float64 99 float64 100 float64 101 float64 102 float64 103 float64 104 float64 105 float64 106 float64 107 float64 108 float64 109 float64 110 float64 111 float64 112 float64 113 float64 114 float64 115 float64 116 float64 117 float64 118 float64 119 float64 120 float64 121 float64 122 float64 123 float64 124 float64 125 float64 126 float64 127 float64 128 float64 129 float64 130 float64 131 float64 132 float64 133 float64 134 float64 135 float64 136 float64 137 float64 138 float64 139 float64 140 float64 142 float64 143 float64 144 float64 145 float64 146 float64 147 float64 148 float64 150 float64 151 float64 152 float64 153 float64 154 float64 155 float64 156 float64 157 float64 158 float64 159 float64 160 float64 161 float64 162 float64 163 float64 164 float64 165 float64 166 float64 167 float64 168 float64 169 float64 170 float64 171 float64 172 float64 173 float64 174 float64 175 float64 176 float64 177 float64 180 float64 181 float64 182 float64 183 float64 184 float64 185 float64 
187 float64 188 float64 195 float64 196 float64 197 float64 198 float64 199 float64 200 float64 201 float64 202 float64 203 float64 204 float64 205 float64 206 float64 207 float64 208 float64 209 float64 210 float64 211 float64 212 float64 213 float64 214 float64 215 float64 216 float64 217 float64 218 float64 219 float64 220 float64 221 float64 222 float64 223 float64 224 float64 225 float64 227 float64 228 float64 238 float64 239 float64 244 float64 245 float64 246 float64 247 float64 248 float64 249 float64 250 float64 251 float64 252 float64 253 float64 254 float64 255 float64 267 float64 268 float64 269 float64 270 float64 271 float64 272 float64 273 float64 274 float64 275 float64 277 float64 278 float64 279 float64 280 float64 281 float64 282 float64 283 float64 285 float64 286 float64 287 float64 288 float64 289 float64 290 float64 291 float64 292 float64 293 float64 294 float64 295 float64 296 float64 297 float64 298 float64 299 float64 300 float64 301 float64 302 float64 303 float64 304 float64 305 float64 306 float64 307 float64 308 float64 309 float64 310 float64 311 float64 312 float64 316 float64 317 float64 318 float64 319 float64 320 float64 321 float64 323 float64 324 float64 331 float64 332 float64 333 float64 334 float64 335 float64 336 float64 337 float64 338 float64 339 float64 340 float64 341 float64 342 float64 343 float64 344 float64 345 float64 346 float64 347 float64 348 float64 349 float64 350 float64 351 float64 352 float64 353 float64 354 float64 355 float64 356 float64 357 float64 358 float64 359 float64 360 float64 361 float64 362 float64 363 float64 365 float64 366 float64 367 float64 368 float64 376 float64 377 float64 382 float64 383 float64 384 float64 385 float64 386 float64 387 float64 388 float64 389 float64 390 float64 391 float64 392 float64 393 float64 405 float64 406 float64 407 float64 408 float64 409 float64 410 float64 411 float64 412 float64 413 float64 415 float64 416 float64 417 float64 418 float64 419 float64 420 
float64 421 float64 423 float64 424 float64 425 float64 426 float64 427 float64 428 float64 429 float64 430 float64 431 float64 432 float64 433 float64 434 float64 435 float64 436 float64 437 float64 438 float64 439 float64 440 float64 441 float64 442 float64 443 float64 444 float64 445 float64 446 float64 447 float64 448 float64 452 float64 453 float64 454 float64 455 float64 456 float64 457 float64 459 float64 460 float64 467 float64 468 float64 469 float64 470 float64 471 float64 472 float64 473 float64 474 float64 475 float64 476 float64 477 float64 478 float64 479 float64 480 float64 482 float64 483 float64 484 float64 485 float64 486 float64 487 float64 488 float64 489 float64 490 float64 491 float64 492 float64 493 float64 494 float64 495 float64 496 float64 497 float64 499 float64 500 float64 510 float64 511 float64 516 float64 517 float64 518 float64 519 float64 520 float64 521 float64 522 float64 523 float64 524 float64 525 float64 526 float64 527 float64 539 float64 540 float64 541 float64 542 float64 543 float64 544 float64 545 float64 546 float64 547 float64 548 float64 549 float64 550 float64 551 float64 552 float64 553 float64 554 float64 555 float64 556 float64 557 float64 558 float64 559 float64 560 float64 561 float64 562 float64 563 float64 564 float64 565 float64 566 float64 567 float64 568 float64 569 float64 570 float64 571 float64 572 float64 573 float64 574 float64 575 float64 576 float64 577 float64 578 float64 579 float64 580 float64 581 float64 582 float64 583 float64 584 float64 585 float64 586 float64 587 float64 588 float64 589 float64 Pass/Fail int64 dtype: object
# after dropping the constant signals
row,column=com.shape
print('After dropping the constant signals the dataset contains', row, 'rows and', column, 'columns')
After dropping the constant signals the dataset contains 1585 rows and 475 columns
#checking for distribution of the target class shows that the data set is highly imbalanced
sg['Pass/Fail'].value_counts(normalize=True)
-1 0.933631 1 0.066369 Name: Pass/Fail, dtype: float64
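This imbalance (roughly 93% passes vs 7% fails) means a classifier can score high accuracy by always predicting "pass". One common mitigation, sketched below with toy data rather than the notebook's variables, is to weight the minority class, e.g. via `class_weight='balanced'` in the already-imported `LogisticRegression` (this step is an illustration, not part of the original notebook):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))
y = np.where(rng.random(300) < 0.07, 1, 0)  # ~7% failures, mimicking SECOM

# 'balanced' reweights each class inversely to its frequency, so the rare
# "fail" class contributes as much to the loss as the common "pass" class.
clf = LogisticRegression(class_weight='balanced', max_iter=1000).fit(X, y)
```

Without such weighting, threshold tuning (e.g. via `binarize` on predicted probabilities, also imported above) is another common option.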
#label encoding the target class
com['Pass/Fail']=com['Pass/Fail'].replace([-1,1],[0,1])
# checking how many rows have missing values shows that every row has at least one missing value
rows=com.isnull().any(axis = 1).sum()
print('All', rows, 'rows have at least one missing value')
All 1585 rows have at least one missing value
# replacing NaN/NA with zero, treating a missing reading as "no signal"
com.fillna(0,inplace=True)
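Before (or instead of) replacing every NaN with zero, it can help to quantify how widespread the missingness is per column; a small illustrative sketch on a hypothetical toy frame:

```python
import numpy as np
import pandas as pd

# Toy frame standing in for the sensor data: 'b' is mostly missing.
df = pd.DataFrame({'a': [1.0, np.nan, 3.0], 'b': [np.nan, np.nan, 6.0]})

# Fraction of missing values per column, worst columns first.
missing_frac = df.isnull().mean().sort_values(ascending=False)
```

Columns that are mostly missing are candidates for dropping outright, while sparsely missing ones may be better served by median imputation than by a constant zero.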
#checking for correlation
import seaborn as sns  # needed for the heatmap and plots below
plt.figure(figsize=(20, 18))
corr = com.corr()
sns.heatmap(abs(corr) > 0.7, cmap="Greens");
#making a copy of the dataset and dropping the target class
com1=com.copy()
com1.drop(['Pass/Fail'],axis=1,inplace=True)
# Create correlation matrix
corr_matrix = com1.corr().abs()
# Select upper triangle of correlation matrix
upper = corr_matrix.where(np.triu(np.ones(corr_matrix.shape), k=1).astype(bool))
# Find features with an absolute correlation greater than 0.70
to_drop = [column for column in upper.columns if any(upper[column] > 0.70)]
# Drop features
com1.drop(to_drop, axis=1, inplace=True)
row,column=com1.shape
print('After dropping the correlated variables the dataset contains', row, 'rows and', column, 'columns')
After dropping the correlated variables the dataset contains 1585 rows and 193 columns
#Boxplot to check for outliers
plt.figure(figsize=(50, 50))
col = 1
for i in com1.columns:
    plt.subplot(20, 10, col)
    sns.boxplot(com1[i], color='blue')
    col += 1
The majority of the attributes have outliers; we will replace them with the median.
#find the outliers and replace them by median
for i in com1.columns:
    q1 = com1[i].quantile(0.25)
    q3 = com1[i].quantile(0.75)
    iqr = q3 - q1
    low = q1 - 1.5 * iqr
    high = q3 + 1.5 * iqr
    com1.loc[(com1[i] < low) | (com1[i] > high), i] = com1[i].median()
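The IQR rule used in the loop above can be seen on a toy series (hypothetical values, same logic): anything outside `[Q1 - 1.5*IQR, Q3 + 1.5*IQR]` is flagged and replaced by the median.

```python
import pandas as pd

s = pd.Series([1.0, 2.0, 2.0, 3.0, 100.0])  # 100.0 is an obvious outlier

q1, q3 = s.quantile(0.25), s.quantile(0.75)  # 2.0 and 3.0 here
iqr = q3 - q1
mask = (s < q1 - 1.5 * iqr) | (s > q3 + 1.5 * iqr)
s[mask] = s.median()  # median is computed before replacement, i.e. 2.0
```

Note that replacing with the median shrinks the spread of each feature, which is one reason the post-treatment boxplots below look much tighter.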
#After treating the outlier values
plt.figure(figsize=(50, 50))
col = 1
for i in com1.columns:
    plt.subplot(20, 10, col)
    sns.boxplot(com1[i], color='blue')
    col += 1
#plotting histogram to check for the frequency of values within a variable
com1.hist(bins = 30, figsize = (40, 40), color = 'blue')
plt.show()
Some variables still have 0 as a constant signal; we will drop them after scaling with z-score
#density plot to check for the distribution of the variables
plt.figure(figsize=(40, 40))
col = 1
for i in com1.columns:
    plt.subplot(20, 10, col)
    sns.distplot(com1[i], color='b')
    col += 1
The majority of the variables seem to have a normal distribution
#scaling with z-score
comScaled= com1.apply(zscore)
comScaled.describe().T
| | count | mean | std | min | 25% | 50% | 75% | max |
|---|---|---|---|---|---|---|---|---|
| 0 | 1585.0 | -8.888439e-15 | 1.000316 | -2.904359 | -0.677576 | 0.020960 | 0.675554 | 2.929639 |
| 1 | 1585.0 | 3.433111e-15 | 1.000316 | -2.926390 | -0.620535 | 0.037138 | 0.625679 | 2.864734 |
| 2 | 1585.0 | -8.631301e-15 | 1.000316 | -2.886075 | -0.695435 | -0.013194 | 0.640689 | 2.801684 |
| 3 | 1585.0 | -5.774561e-16 | 1.000316 | -1.893417 | -0.782209 | -0.178455 | 0.625548 | 3.011310 |
| 4 | 1585.0 | -2.338823e-16 | 1.000316 | -1.844692 | -0.834187 | 0.044671 | 0.582986 | 2.854370 |
| 8 | 1585.0 | 1.101117e-16 | 1.000316 | -2.924112 | -0.724213 | -0.031550 | 0.731765 | 2.661526 |
| 9 | 1585.0 | -2.591688e-17 | 1.000316 | -2.734072 | -0.674986 | 0.008976 | 0.656940 | 2.701626 |
| 10 | 1585.0 | -7.494881e-18 | 1.000316 | -2.762031 | -0.653986 | 0.032634 | 0.671071 | 2.755024 |
| 11 | 1585.0 | 1.391806e-14 | 1.000316 | -2.872244 | -0.709347 | 0.116684 | 0.714469 | 2.192630 |
| 14 | 1585.0 | -3.479516e-16 | 1.000316 | -2.496898 | -0.678264 | 0.008569 | 0.684129 | 2.754786 |
| 15 | 1585.0 | -3.033816e-15 | 1.000316 | -2.734990 | -0.750380 | -0.027389 | 0.759829 | 2.637318 |
| 16 | 1585.0 | -2.738924e-15 | 1.000316 | -2.772183 | -0.655619 | -0.023929 | 0.615534 | 2.679639 |
| 19 | 1585.0 | -6.498482e-15 | 1.000316 | -2.630849 | -0.585356 | -0.034525 | 0.686764 | 2.539703 |
| 20 | 1585.0 | -1.512250e-14 | 1.000316 | -2.681456 | -0.664070 | 0.010892 | 0.670855 | 2.725739 |
| 21 | 1585.0 | -2.577854e-16 | 1.000316 | -2.421282 | -0.468853 | 0.322962 | 0.661827 | 2.651531 |
| 23 | 1585.0 | 1.718569e-16 | 1.000316 | -2.784988 | -0.469389 | 0.036146 | 0.421677 | 2.782795 |
| 24 | 1585.0 | -2.484868e-16 | 1.000316 | -2.982086 | -0.511400 | 0.069580 | 0.797731 | 3.020519 |
| 25 | 1585.0 | 4.468910e-16 | 1.000316 | -2.516015 | 0.189865 | 0.450554 | 0.613012 | 1.206175 |
| 28 | 1585.0 | 3.210051e-15 | 1.000316 | -2.729185 | -0.612880 | -0.098182 | 0.800129 | 2.430537 |
| 29 | 1585.0 | -3.144698e-16 | 1.000316 | -2.845681 | -0.734662 | 0.007096 | 0.691598 | 2.603120 |
| 31 | 1585.0 | -3.968785e-15 | 1.000316 | -3.740614 | -0.569690 | 0.328471 | 0.452585 | 4.919370 |
| 32 | 1585.0 | 8.832708e-15 | 1.000316 | -2.153454 | -0.694692 | 0.011708 | 0.661211 | 2.641406 |
| 33 | 1585.0 | -4.544560e-16 | 1.000316 | -2.072355 | -0.647322 | -0.107606 | 0.557447 | 2.773153 |
| 36 | 1585.0 | -5.487654e-15 | 1.000316 | -3.014239 | -0.734913 | 0.030086 | 0.741212 | 2.780558 |
| 40 | 1585.0 | 1.703089e-15 | 1.000316 | -4.522295 | -0.185596 | -0.064749 | 0.601639 | 2.652593 |
| 41 | 1585.0 | 9.551420e-16 | 1.000316 | -2.979843 | -0.548548 | 0.060201 | 0.452464 | 3.202011 |
| 43 | 1585.0 | 5.326969e-15 | 1.000316 | -2.074364 | -0.764196 | -0.290161 | 0.852238 | 3.211320 |
| 44 | 1585.0 | -1.068518e-14 | 1.000316 | -2.872980 | -0.656880 | -0.025240 | 0.715563 | 2.720276 |
| 45 | 1585.0 | 6.682352e-17 | 1.000316 | -2.863606 | -0.784207 | -0.024164 | 0.722449 | 3.000256 |
| 47 | 1585.0 | -7.041336e-16 | 1.000316 | -3.599632 | -1.018141 | 0.379131 | 0.852819 | 1.763187 |
| 53 | 1585.0 | -1.356111e-14 | 1.000316 | -2.718269 | -0.704410 | -0.022297 | 0.627335 | 2.738639 |
| 59 | 1585.0 | -3.642372e-17 | 1.000316 | -3.095865 | -0.585831 | 0.094912 | 0.481649 | 3.445363 |
| 60 | 1585.0 | -8.309371e-15 | 1.000316 | -2.019516 | -0.775202 | -0.245307 | 0.768618 | 2.968363 |
| 62 | 1585.0 | -2.461053e-15 | 1.000316 | -2.826515 | -0.674821 | -0.061248 | 0.674969 | 2.747456 |
| 63 | 1585.0 | -1.265724e-16 | 1.000316 | -2.815937 | -0.697842 | -0.009364 | 0.624403 | 2.944701 |
| 64 | 1585.0 | 1.678293e-16 | 1.000316 | -2.641462 | -0.665341 | 0.009330 | 0.607763 | 2.857843 |
| 67 | 1585.0 | -1.600350e-15 | 1.000316 | -3.388588 | -0.756136 | 0.020949 | 0.687596 | 3.503777 |
| 71 | 1585.0 | -2.284888e-16 | 1.000316 | -2.715569 | -0.574529 | 0.164629 | 0.616625 | 2.875594 |
| 72 | 1585.0 | 1.793168e-15 | 1.000316 | -1.751047 | -0.971501 | -0.971501 | 1.023499 | 1.327558 |
| 74 | 0.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| 75 | 1585.0 | 8.025477e-17 | 1.000316 | -2.834398 | -0.661983 | -0.007509 | 0.657965 | 2.731383 |
| 76 | 1585.0 | 1.962328e-16 | 1.000316 | -2.897060 | -0.692488 | 0.007677 | 0.717303 | 2.820951 |
| 77 | 1585.0 | -2.997952e-17 | 1.000316 | -2.555301 | -0.734580 | -0.021817 | 0.542748 | 2.635166 |
| 78 | 1585.0 | 4.806705e-16 | 1.000316 | -2.551167 | -0.731763 | 0.068050 | 0.580111 | 2.537727 |
| 79 | 1585.0 | 1.501778e-16 | 1.000316 | -2.699525 | -0.613193 | 0.015078 | 0.584077 | 2.860076 |
| 80 | 1585.0 | 3.162560e-17 | 1.000316 | -2.219007 | -0.487344 | 0.261721 | 0.601004 | 2.378932 |
| 81 | 1585.0 | -3.986296e-16 | 1.000316 | -2.897057 | -0.551720 | -0.067761 | 0.583721 | 2.835989 |
| 82 | 1585.0 | -7.923910e-17 | 1.000316 | -2.708908 | -0.680740 | 0.106510 | 0.716962 | 2.891906 |
| 83 | 1585.0 | 1.620575e-15 | 1.000316 | -2.794234 | -0.687269 | 0.027967 | 0.703256 | 2.754617 |
| 84 | 1585.0 | 1.331427e-15 | 1.000316 | -2.604483 | -0.673220 | -0.029466 | 0.635055 | 2.670149 |
| 85 | 0.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| 86 | 1585.0 | -4.702276e-15 | 1.000316 | -2.948286 | -0.739383 | 0.034026 | 0.748843 | 2.629633 |
| 87 | 1585.0 | -6.054323e-15 | 1.000316 | -2.876334 | -0.477657 | 0.460492 | 0.705690 | 1.110800 |
| 88 | 1585.0 | 6.987751e-16 | 1.000316 | -2.680166 | -0.627131 | -0.014610 | 0.657315 | 2.628156 |
| 89 | 1585.0 | 1.045081e-15 | 1.000316 | -3.380507 | -0.841277 | 0.159026 | 0.702780 | 3.067598 |
| 90 | 1585.0 | 8.707371e-16 | 1.000316 | -2.909511 | -0.664558 | -0.039004 | 0.655154 | 2.812032 |
| 91 | 1585.0 | -2.633715e-17 | 1.000316 | -2.715460 | -0.603171 | -0.071476 | 0.584811 | 2.656535 |
| 92 | 1585.0 | 3.768454e-17 | 1.000316 | -2.766864 | -0.612168 | -0.028605 | 0.644738 | 2.889213 |
| 93 | 1585.0 | 7.564927e-17 | 1.000316 | -2.978392 | -0.640780 | -0.028549 | 0.583683 | 2.809980 |
| 94 | 1585.0 | -4.499730e-16 | 1.000316 | -2.843189 | -0.563748 | 0.196065 | 0.196065 | 3.235320 |
| 95 | 1585.0 | -1.683616e-15 | 1.000316 | -1.921436 | -0.685447 | -0.685447 | 0.550541 | 1.786530 |
| 99 | 1585.0 | -1.511234e-17 | 1.000316 | -2.712800 | -0.640822 | 0.013117 | 0.645961 | 2.802318 |
| 100 | 1585.0 | -2.351782e-17 | 1.000316 | -2.757913 | -0.635755 | 0.071631 | 0.425324 | 2.901175 |
| 102 | 1585.0 | -2.073350e-17 | 1.000316 | -2.750931 | -0.649565 | 0.040620 | 0.597287 | 2.840387 |
| 103 | 1585.0 | 2.373145e-16 | 1.000316 | -2.772177 | -0.718850 | -0.095518 | 0.564480 | 2.654474 |
| 107 | 1585.0 | -1.414921e-16 | 1.000316 | -2.816884 | -0.636667 | -0.016233 | 0.628845 | 2.751077 |
| 108 | 1585.0 | -2.101369e-19 | 1.000316 | -2.656532 | -0.652724 | 0.048878 | 0.640055 | 2.781221 |
| 109 | 1585.0 | 2.636657e-15 | 1.000316 | -0.727914 | -0.727914 | -0.727914 | 1.371924 | 1.382646 |
| 112 | 1585.0 | 3.097137e-15 | 1.000316 | -1.080001 | -1.080001 | 0.892602 | 0.929012 | 1.037373 |
| 113 | 1585.0 | 1.013952e-14 | 1.000316 | -2.654192 | -0.631054 | 0.052960 | 0.611731 | 2.586700 |
| 114 | 0.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| 115 | 1585.0 | -2.133099e-15 | 1.000316 | -2.726284 | -0.579266 | 0.036880 | 0.639811 | 2.588790 |
| 116 | 1585.0 | -1.198638e-13 | 1.000316 | -3.148536 | -0.393556 | 0.041441 | 0.621437 | 2.796422 |
| 117 | 1585.0 | -7.755197e-15 | 1.000316 | -2.821470 | -0.580452 | -0.013996 | 0.663836 | 2.789795 |
| 118 | 1585.0 | -1.091769e-14 | 1.000316 | -2.804230 | -0.618275 | 0.025533 | 0.669342 | 2.675629 |
| 119 | 1585.0 | -5.178227e-15 | 1.000316 | -2.564476 | -0.883932 | -0.228805 | 1.038724 | 1.665367 |
| 120 | 1585.0 | 9.401242e-15 | 1.000316 | -2.829593 | -0.653474 | 0.015093 | 0.644332 | 2.780031 |
| 121 | 1585.0 | -3.739735e-15 | 1.000316 | -2.811582 | -0.635360 | -0.091305 | 0.670372 | 2.520160 |
| 122 | 1585.0 | -9.119939e-17 | 1.000316 | -2.573631 | -0.773665 | 0.015150 | 0.607649 | 2.665429 |
| 126 | 1585.0 | -2.870294e-15 | 1.000316 | -2.008862 | -0.780919 | 0.055128 | 0.666487 | 3.111922 |
| 129 | 1585.0 | 8.573583e-16 | 1.000316 | -3.346141 | -0.437869 | 0.084513 | 0.382927 | 2.546426 |
| 130 | 1585.0 | -1.369436e-16 | 1.000316 | -3.051506 | -0.716604 | 0.154313 | 0.843321 | 1.771550 |
| 134 | 1585.0 | 1.193490e-15 | 1.000316 | -2.285426 | -0.723972 | -0.125734 | 0.587844 | 2.624822 |
| 135 | 1585.0 | 1.032122e-16 | 1.000316 | -2.161809 | -0.766030 | -0.004697 | 0.672044 | 3.040638 |
| 136 | 1585.0 | 1.825389e-16 | 1.000316 | -2.535792 | -0.882465 | -0.059484 | 0.789273 | 2.910245 |
| 137 | 1585.0 | -1.924153e-16 | 1.000316 | -2.336001 | -0.784630 | -0.087471 | 0.743758 | 3.049746 |
| 138 | 1585.0 | -5.911850e-17 | 1.000316 | -2.836199 | -0.640083 | -0.157170 | 0.498201 | 2.797810 |
| 139 | 1585.0 | 5.757750e-16 | 1.000316 | -2.226109 | -0.734639 | -0.162187 | 0.539337 | 3.110651 |
| 142 | 1585.0 | -1.959876e-16 | 1.000316 | -2.622640 | -0.677654 | -0.025478 | 0.603612 | 2.744827 |
| 143 | 1585.0 | -3.058192e-16 | 1.000316 | -2.689759 | -0.745006 | -0.189363 | 0.644103 | 2.774070 |
| 144 | 1585.0 | -4.776673e-16 | 1.000316 | -2.338285 | -0.702773 | 0.037177 | 0.545298 | 3.149414 |
| 145 | 1585.0 | 2.264225e-17 | 1.000316 | -2.394245 | -0.715257 | -0.048692 | 0.586431 | 2.950849 |
| 146 | 1585.0 | 1.225752e-16 | 1.000316 | -2.233928 | -0.681340 | -0.082372 | 0.532359 | 3.046448 |
| 150 | 1585.0 | 2.309754e-17 | 1.000316 | -2.435885 | -0.748811 | -0.184184 | 0.554407 | 2.841657 |
| 151 | 1585.0 | -9.625669e-16 | 1.000316 | -2.664338 | -0.742425 | -0.057418 | 0.669587 | 2.907785 |
| 153 | 1585.0 | -4.280488e-16 | 1.000316 | -2.371380 | -0.850323 | -0.058539 | 0.712408 | 2.921067 |
| 155 | 1585.0 | 1.595079e-15 | 1.000316 | -2.679038 | -0.782886 | -0.150835 | 0.481216 | 2.535381 |
| 156 | 1585.0 | -5.701713e-17 | 1.000316 | -2.040688 | -0.742010 | -0.102979 | 0.664890 | 3.154022 |
| 157 | 0.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| 158 | 0.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| 159 | 1585.0 | 2.658932e-16 | 1.000316 | -1.982776 | -0.722193 | -0.075114 | 0.520076 | 3.508237 |
| 160 | 1585.0 | -2.525144e-17 | 1.000316 | -2.177278 | -0.730355 | -0.028965 | 0.731283 | 3.262172 |
| 161 | 1585.0 | 1.555713e-16 | 1.000316 | -1.305557 | -0.733742 | -0.165878 | 0.459957 | 3.325624 |
| 162 | 1585.0 | 9.599752e-17 | 1.000316 | -0.801513 | -0.676594 | -0.304087 | 0.160422 | 3.394539 |
| 166 | 1585.0 | -1.949019e-17 | 1.000316 | -2.494475 | -0.754684 | -0.085533 | 0.583618 | 2.858729 |
| 167 | 1585.0 | 2.108023e-16 | 1.000316 | -2.242405 | -0.742421 | 0.007571 | 0.507566 | 3.007539 |
| 168 | 1585.0 | 1.176766e-17 | 1.000316 | -3.009047 | -0.774769 | -0.054835 | 0.640274 | 2.949028 |
| 169 | 1585.0 | 2.661733e-16 | 1.000316 | -2.013117 | -0.887886 | 0.049807 | 0.698588 | 3.055491 |
| 170 | 1585.0 | -4.006697e-16 | 1.000316 | -2.461292 | -0.693231 | 0.009664 | 0.718289 | 2.552565 |
| 171 | 1585.0 | -9.568231e-17 | 1.000316 | -2.714510 | -0.744326 | 0.063005 | 0.574067 | 2.939275 |
| 172 | 1585.0 | -4.286792e-16 | 1.000316 | -2.419102 | -0.625292 | 0.040408 | 0.717560 | 2.696048 |
| 173 | 1585.0 | 2.188926e-16 | 1.000316 | -2.415127 | -0.620674 | 0.030164 | 0.661244 | 2.620731 |
| 175 | 1585.0 | 1.753434e-15 | 1.000316 | -2.518782 | -0.737836 | -0.031344 | 0.581316 | 2.681472 |
| 176 | 1585.0 | 1.731177e-16 | 1.000316 | -2.314768 | -0.647774 | -0.003801 | 0.667344 | 2.593830 |
| 177 | 1585.0 | 4.861166e-17 | 1.000316 | -3.051046 | -0.704924 | 0.108822 | 0.605523 | 3.818232 |
| 180 | 1585.0 | 4.815549e-16 | 1.000316 | -2.619476 | -0.678730 | -0.085090 | 0.629235 | 2.658049 |
| 181 | 1585.0 | -6.528251e-17 | 1.000316 | -2.643762 | -0.794243 | -0.087795 | 0.716770 | 2.978384 |
| 182 | 1585.0 | 3.940066e-16 | 1.000316 | -2.916412 | -0.793926 | -0.105627 | 0.725269 | 3.050679 |
| 183 | 1585.0 | -3.971586e-17 | 1.000316 | -3.206689 | -0.820184 | 0.080007 | 0.767242 | 2.494322 |
| 184 | 1585.0 | 9.749255e-17 | 1.000316 | -2.287066 | -0.729647 | -0.071735 | 0.601277 | 2.844649 |
| 188 | 1585.0 | 2.698157e-16 | 1.000316 | -2.031508 | -0.863079 | -0.126109 | 0.687769 | 3.006002 |
| 195 | 1585.0 | -1.028620e-16 | 1.000316 | -2.869082 | -0.301172 | 0.026646 | 0.554798 | 2.813101 |
| 200 | 1585.0 | -1.609648e-16 | 1.000316 | -2.826203 | -0.696034 | 0.003110 | 0.664842 | 2.797349 |
| 201 | 1585.0 | -9.883437e-17 | 1.000316 | -2.401279 | -0.741931 | -0.167669 | 0.593561 | 3.014139 |
| 208 | 1585.0 | 4.129714e-16 | 1.000316 | -2.541621 | -0.608625 | 0.035163 | 0.660716 | 2.641708 |
| 210 | 1585.0 | 2.175617e-16 | 1.000316 | -2.229601 | -0.687593 | -0.085389 | 0.566999 | 3.181113 |
| 211 | 1585.0 | -2.412371e-16 | 1.000316 | -2.651204 | -0.600775 | -0.016834 | 0.573743 | 2.790064 |
| 212 | 1585.0 | 9.393117e-17 | 1.000316 | -2.200943 | -0.670845 | -0.246612 | 0.492220 | 2.970884 |
| 213 | 1585.0 | 2.174128e-16 | 1.000316 | -2.077752 | -0.736454 | 0.017084 | 0.616147 | 2.827781 |
| 214 | 1585.0 | -3.510686e-16 | 1.000316 | -2.919076 | -0.699018 | 0.041001 | 0.673598 | 3.009035 |
| 215 | 1585.0 | 7.655986e-17 | 1.000316 | -2.663944 | -0.604241 | 0.078636 | 0.632320 | 2.614507 |
| 216 | 1585.0 | -4.835249e-16 | 1.000316 | -2.706567 | -0.561343 | 0.077457 | 0.644749 | 2.737534 |
| 217 | 1585.0 | 1.836246e-16 | 1.000316 | -2.222936 | -0.675136 | -0.123818 | 0.537078 | 2.834807 |
| 218 | 1585.0 | 4.164912e-16 | 1.000316 | -2.530927 | -0.710640 | -0.067922 | 0.619963 | 2.735548 |
| 219 | 1585.0 | 1.347678e-16 | 1.000316 | -3.000156 | -0.783070 | 0.023144 | 0.627804 | 3.147220 |
| 221 | 1585.0 | 1.260121e-16 | 1.000316 | -1.830569 | -0.907337 | 0.056626 | 0.726422 | 3.102387 |
| 222 | 1585.0 | -1.232102e-16 | 1.000316 | -1.071661 | -0.646745 | -0.299086 | 0.203087 | 3.409272 |
| 223 | 1585.0 | -3.567150e-16 | 1.000316 | -2.654933 | -0.748741 | -0.006192 | 0.655257 | 2.856388 |
| 225 | 1585.0 | -8.398469e-17 | 1.000316 | -2.410003 | -0.686900 | -0.056862 | 0.623793 | 2.753603 |
| 227 | 1585.0 | 2.293994e-17 | 1.000316 | -2.103706 | -0.714335 | -0.068570 | 0.518488 | 3.179819 |
| 228 | 1585.0 | 3.089887e-16 | 1.000316 | -1.844560 | -0.709932 | -0.132303 | 0.548474 | 3.085916 |
| 238 | 1585.0 | 1.564819e-16 | 1.000316 | -2.441783 | -0.713811 | -0.065821 | 0.654167 | 2.886131 |
| 239 | 1585.0 | 1.314056e-16 | 1.000316 | -2.592366 | -0.740600 | -0.067231 | 0.606139 | 2.794589 |
| 244 | 1585.0 | 9.992287e-15 | 1.000316 | -0.580445 | -0.580445 | -0.580445 | 0.496004 | 3.246931 |
| 247 | 1585.0 | -3.574428e-15 | 1.000316 | -0.852754 | -0.852754 | 0.034267 | 0.451344 | 3.564728 |
| 250 | 1585.0 | 1.321060e-16 | 1.000316 | -2.196318 | -0.761406 | -0.065327 | 0.607115 | 2.828589 |
| 251 | 1585.0 | -2.534951e-16 | 1.000316 | -2.116604 | -0.833936 | 0.128064 | 0.448731 | 3.655400 |
| 253 | 1585.0 | 2.591688e-16 | 1.000316 | -2.317842 | -0.705181 | -0.068604 | 0.589191 | 2.859648 |
| 255 | 1585.0 | -5.703114e-16 | 1.000316 | -2.348376 | -0.799729 | 0.021846 | 0.651861 | 2.562342 |
| 267 | 1585.0 | 4.368045e-16 | 1.000316 | -2.326990 | -0.871408 | -0.005386 | 0.724070 | 2.929094 |
| 268 | 1585.0 | -3.757247e-16 | 1.000316 | -2.596533 | -0.757898 | -0.200510 | 0.683838 | 2.835738 |
| 269 | 1585.0 | 2.308704e-16 | 1.000316 | -2.394794 | -0.737049 | -0.001041 | 0.620663 | 2.887590 |
| 345 | 1585.0 | -1.584817e-15 | 1.000316 | -0.875779 | -0.875779 | -0.875779 | 0.854972 | 3.462820 |
| 418 | 1585.0 | -3.631165e-16 | 1.000316 | -1.116340 | -1.116340 | -0.061924 | 0.707787 | 2.369128 |
| 419 | 1585.0 | 2.258271e-16 | 1.000316 | -0.947360 | -0.947360 | -0.122448 | 0.842814 | 2.120946 |
| 423 | 1585.0 | -2.857861e-16 | 1.000316 | -2.031823 | -0.717350 | -0.175163 | 0.583275 | 2.917538 |
| 432 | 1585.0 | 1.023366e-16 | 1.000316 | -1.252569 | -0.703894 | -0.216823 | 0.418339 | 3.269618 |
| 433 | 1585.0 | 4.101871e-16 | 1.000316 | -0.980412 | -0.924548 | -0.143217 | 0.523429 | 3.138143 |
| 438 | 1585.0 | 7.347785e-16 | 1.000316 | -2.556041 | -0.754777 | -0.140008 | 0.626929 | 2.918052 |
| 460 | 1585.0 | -2.752092e-16 | 1.000316 | -2.690033 | -0.716169 | -0.134792 | 0.582233 | 2.930464 |
| 468 | 1585.0 | -3.253619e-16 | 1.000316 | -1.020606 | -0.839871 | -0.255637 | 0.527731 | 3.013834 |
| 472 | 1585.0 | 9.897446e-17 | 1.000316 | -2.830717 | -0.698688 | 0.036299 | 0.691183 | 2.745285 |
| 476 | 1585.0 | -1.234904e-16 | 1.000316 | -2.192035 | -0.718301 | -0.157392 | 0.472973 | 3.172333 |
| 482 | 1585.0 | 4.034628e-17 | 1.000316 | -1.116932 | -1.116932 | -0.085200 | 0.703802 | 2.425939 |
| 483 | 1585.0 | 1.924854e-16 | 1.000316 | -1.356771 | -0.689694 | -0.118410 | 0.449420 | 3.151215 |
| 484 | 1585.0 | 9.526204e-17 | 1.000316 | -1.156699 | -0.652831 | -0.223232 | 0.414507 | 3.026581 |
| 485 | 1585.0 | 6.178023e-17 | 1.000316 | -1.022856 | -0.706081 | -0.272428 | 0.417019 | 3.223942 |
| 486 | 1585.0 | 4.623711e-16 | 1.000316 | -1.034019 | -1.034019 | -0.196543 | 0.697251 | 2.443430 |
| 487 | 1585.0 | -4.280488e-16 | 1.000316 | -0.899540 | -0.693374 | -0.443277 | 0.549951 | 2.831051 |
| 488 | 1585.0 | 2.927732e-16 | 1.000316 | -1.366065 | -0.877389 | -0.005874 | 0.633058 | 2.566024 |
| 489 | 1585.0 | 7.438844e-17 | 1.000316 | -1.278179 | -0.702064 | -0.116052 | 0.484225 | 2.822029 |
| 499 | 1585.0 | 2.927907e-17 | 1.000316 | -0.809559 | -0.809559 | -0.809559 | 0.841696 | 2.270908 |
| 500 | 1585.0 | 2.249865e-16 | 1.000316 | -0.744957 | -0.744957 | -0.744957 | 0.821397 | 2.347728 |
| 510 | 1585.0 | -3.275333e-16 | 1.000316 | -2.605576 | -0.719978 | -0.082742 | 0.591015 | 3.156036 |
| 511 | 1585.0 | 3.068698e-16 | 1.000316 | -0.834202 | -0.834202 | -0.834202 | 0.844443 | 2.204724 |
| 521 | 0.0 | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| 542 | 1585.0 | 5.192902e-15 | 1.000316 | -2.161189 | -0.582912 | -0.582912 | 0.811844 | 2.647049 |
| 543 | 1585.0 | 1.806897e-15 | 1.000316 | -2.157012 | -0.374845 | -0.374845 | 0.070696 | 2.966717 |
| 544 | 1585.0 | 6.820342e-16 | 1.000316 | -2.872212 | 0.365666 | 0.365666 | 0.365666 | 2.308393 |
| 546 | 1585.0 | -1.843601e-16 | 1.000316 | -1.740768 | -0.460227 | -0.040573 | 0.724545 | 2.571886 |
| 558 | 1585.0 | 4.722195e-15 | 1.000316 | -2.808815 | -0.635559 | -0.034738 | 0.456219 | 3.271496 |
| 559 | 1585.0 | -1.823287e-16 | 1.000316 | -1.654003 | -0.882388 | -0.154586 | 0.612390 | 2.739099 |
| 562 | 1585.0 | 8.638463e-16 | 1.000316 | -3.134250 | -0.103897 | 0.211403 | 0.269229 | 3.353755 |
| 565 | 1585.0 | -9.243220e-16 | 1.000316 | -1.511814 | -0.268290 | -0.231424 | 0.615080 | 2.515104 |
| 570 | 1585.0 | -1.544296e-15 | 1.000316 | -3.039557 | -0.665248 | -0.121807 | 0.646967 | 2.654809 |
| 571 | 1585.0 | -6.280465e-16 | 1.000316 | -2.825381 | -0.617145 | -0.076842 | 0.740330 | 2.895303 |
| 572 | 1585.0 | -5.550415e-16 | 1.000316 | -2.275769 | -0.678442 | 0.024142 | 0.672681 | 3.110707 |
| 578 | 1585.0 | 1.078702e-16 | 1.000316 | -2.141442 | -0.677800 | -0.677800 | 0.759860 | 3.132865 |
| 582 | 1585.0 | -2.063159e-14 | 1.000316 | -2.650239 | -0.642984 | 0.005514 | 0.684892 | 2.692147 |
| 583 | 1585.0 | -3.192679e-16 | 1.000316 | -2.476380 | -0.718543 | -0.027964 | 0.599835 | 3.079641 |
| 586 | 1585.0 | -1.701058e-16 | 1.000316 | -2.607597 | -0.693461 | -0.006336 | 0.670974 | 2.732351 |
| 587 | 1585.0 | -1.748514e-16 | 1.000316 | -2.340171 | -0.725647 | -0.085930 | 0.553787 | 2.914647 |
| 589 | 1585.0 | 6.092218e-17 | 1.000316 | -1.685870 | -0.708184 | -0.098805 | 0.381081 | 3.162781 |
#combining the data
y=com['Pass/Fail']
comb=pd.concat([comScaled,y],axis=1)
#dropping the columns that became all-NaN after z-scoring (constant signals)
comb.dropna(axis=1,inplace=True)
row,column=comb.shape
print('After dropping NaN variables the dataset contains', row, 'rows and', column, 'columns')
After dropping NaN variables the dataset contains 1585 rows and 188 columns
#splitting the dataset into train and validation set
X=comb.iloc[0:1567,:]
val=comb.iloc[1567:,:]
val=val.drop(['Pass/Fail'],axis=1)
#creating a copy of the train data and separating the target column from the predictor variables
sg=X.copy()
X=sg.drop(['Pass/Fail'],axis=1)
y=sg['Pass/Fail']
row,column=val.shape
print('The reduced validation dataset contains', row, 'rows and', column, 'columns')
The reduced validation dataset contains 18 rows and 187 columns
row,column=X.shape
print('The reduced training dataset contains', row, 'rows and', column, 'columns')
The reduced training dataset contains 1567 rows and 187 columns
# splitting the training dataset into train and test sets for the independent attributes
from sklearn.model_selection import train_test_split
X_train, X_test, Y_train, Y_test = train_test_split(X, y, test_size=0.30, random_state=105, stratify=y)
print("Training Fail : {0} ({1:0.2f}%)".format(len(Y_train[Y_train[:] == 1]), (len(Y_train[Y_train[:] == 1])/len(Y_train)) * 100))
print("Training Pass : {0} ({1:0.2f}%)".format(len(Y_train[Y_train[:] == 0]), (len(Y_train[Y_train[:] == 0])/len(Y_train)) * 100))
print("")
print("Test Fail : {0} ({1:0.2f}%)".format(len(Y_test[Y_test[:] == 1]), (len(Y_test[Y_test[:] == 1])/len(Y_test)) * 100))
print("Test Pass : {0} ({1:0.2f}%)".format(len(Y_test[Y_test[:] == 0]), (len(Y_test[Y_test[:] == 0])/len(Y_test)) * 100))
print("")
Training Fail : 73 (6.66%)
Training Pass : 1023 (93.34%)

Test Fail : 31 (6.58%)
Test Pass : 440 (93.42%)
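With `stratify=y`, both splits keep the ~6.6% fail rate seen above. A minimal sketch on synthetic labels (the data here is hypothetical, mimicking the Pass/Fail imbalance):

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Synthetic labels with exactly 7% positives
y = np.array([0] * 930 + [1] * 70)
X = np.zeros((1000, 1))
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.30,
                                          random_state=105, stratify=y)
print(round(y_tr.mean(), 3), round(y_te.mean(), 3))  # → 0.07 0.07
```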
# Initializing various classification algorithms on the original (imbalanced) dataset and choosing the best model based on f1 score for tuning
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier, GradientBoostingClassifier
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier
models = []
models.append(("LR", LogisticRegression()))
models.append(("KNN", KNeighborsClassifier()))
models.append(("GNB", GaussianNB()))
models.append(("SVM", SVC(kernel='linear')))
models.append(("DT", DecisionTreeClassifier()))
models.append(("RF", RandomForestClassifier()))
models.append(("AB", AdaBoostClassifier()))
models.append(("GBT", GradientBoostingClassifier()))
models.append(("XGB", XGBClassifier(verbosity=0)))
models.append(("LightGBM", LGBMClassifier()))
#testing models
from sklearn.model_selection import StratifiedKFold, cross_val_score
results = []
names = []
for name, model in models:
    kfold = StratifiedKFold(n_splits=10, random_state=55, shuffle=True)
    cv_results = cross_val_score(model, X_train, Y_train, cv=kfold, scoring='f1')
    results.append(cv_results)
    names.append(name)
    msg = '%s: %f%% (%f%%)' % (name, cv_results.mean()*100, cv_results.std()*100)
    print(msg)
LR: 12.303221% (14.118950%)
KNN: 2.500000% (7.500000%)
GNB: 15.396927% (5.431599%)
SVM: 13.678232% (11.195580%)
DT: 12.994742% (8.385132%)
RF: 0.000000% (0.000000%)
AB: 13.436008% (16.037941%)
GBT: 9.555556% (12.214139%)
XGB: 0.000000% (0.000000%)
LightGBM: 0.000000% (0.000000%)
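The near-zero F1 scores for RF, XGB and LightGBM suggest those models mostly predict the majority Pass class, which is exactly what a no-skill baseline does. A quick comparison against a majority-class dummy, on synthetic imbalanced noise data (so the numbers are illustrative only):

```python
import numpy as np
from sklearn.dummy import DummyClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Synthetic ~7% imbalance over pure-noise features (hypothetical data)
rng = np.random.RandomState(0)
X = rng.normal(size=(1000, 20))
y = (rng.rand(1000) < 0.07).astype(int)

kfold = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
dummy_f1 = cross_val_score(DummyClassifier(strategy="most_frequent"), X, y,
                           cv=kfold, scoring="f1")
rf_f1 = cross_val_score(RandomForestClassifier(random_state=0), X, y,
                        cv=kfold, scoring="f1")
print(f"majority baseline F1: {dummy_f1.mean():.3f}")
print(f"random forest F1:     {rf_f1.mean():.3f}")
```

Always predicting the majority class yields an F1 of 0 for the Fail class, so a model scoring near zero has learned nothing useful about the minority class.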
# Implementing random under sampling
from imblearn.under_sampling import RandomUnderSampler
under = RandomUnderSampler(sampling_strategy=0.5)
X_under, y_under = under.fit_resample(X_train, Y_train)
print("Under Training Fail : {0} ({1:0.2f}%)".format(len(y_under[y_under[:] == 1]), (len(y_under[y_under[:] == 1])/len(y_under)) * 100))
print("Under Training Pass : {0} ({1:0.2f}%)".format(len(y_under[y_under[:] == 0]), (len(y_under[y_under[:] == 0])/len(y_under)) * 100))
Under Training Fail : 73 (33.33%)
Under Training Pass : 146 (66.67%)
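What `RandomUnderSampler` does here can be sketched with plain NumPy (this is not the imblearn implementation): `sampling_strategy=0.5` keeps the minority class intact and randomly retains twice as many majority rows.

```python
import numpy as np

# Labels matching the 1023 Pass / 73 Fail training split above
rng = np.random.RandomState(0)
y = np.array([0] * 1023 + [1] * 73)
minority = np.where(y == 1)[0]
majority = np.where(y == 0)[0]
keep = rng.choice(majority, size=2 * len(minority), replace=False)
idx = np.concatenate([keep, minority])
print(len(idx), (y[idx] == 1).mean())  # 219 samples, ~33% Fail
```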
# Initializing various classification algorithms on the random under sampled dataset and choosing the best model based on f1 score
models = []
models.append(("LR", LogisticRegression()))
models.append(("KNN", KNeighborsClassifier()))
models.append(("GNB", GaussianNB()))
models.append(("SVM", SVC(kernel='linear')))
models.append(("DT", DecisionTreeClassifier()))
models.append(("RF", RandomForestClassifier()))
models.append(("AB", AdaBoostClassifier()))
models.append(("GBT", GradientBoostingClassifier()))
models.append(("XGB", XGBClassifier(verbosity=0)))
models.append(("LightGBM",LGBMClassifier()))
#testing models
results = []
names = []
for name, model in models:
    kfold = StratifiedKFold(n_splits=10, random_state=66, shuffle=True)
    cv_results = cross_val_score(model, X_under, y_under, cv=kfold, scoring='f1')
    results.append(cv_results)
    names.append(name)
    msg = '%s: %f%% (%f%%)' % (name, cv_results.mean()*100, cv_results.std()*100)
    print(msg)
LR: 40.088467% (14.420686%)
KNN: 38.174048% (18.430721%)
GNB: 52.030363% (12.660184%)
SVM: 40.496337% (15.186011%)
DT: 45.997432% (9.507136%)
RF: 16.398990% (18.704524%)
AB: 39.132784% (18.853274%)
GBT: 34.679654% (18.669068%)
XGB: 41.849928% (15.285761%)
LightGBM: 41.736264% (13.714290%)
# Implementing SMOTE
from imblearn.over_sampling import SMOTE
smt = SMOTE(sampling_strategy=0.5)
X_SMOTE, y_SMOTE = smt.fit_resample(X_train, Y_train)
print("SMOTE Training Fail : {0} ({1:0.2f}%)".format(len(y_SMOTE[y_SMOTE[:] == 1]), (len(y_SMOTE[y_SMOTE[:] == 1])/len(y_SMOTE)) * 100))
print("SMOTE Training Pass : {0} ({1:0.2f}%)".format(len(y_SMOTE[y_SMOTE[:] == 0]), (len(y_SMOTE[y_SMOTE[:] == 0])/len(y_SMOTE)) * 100))
SMOTE Training Fail : 511 (33.31%)
SMOTE Training Pass : 1023 (66.69%)
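Unlike random oversampling, SMOTE synthesizes new minority points by interpolating between a minority sample and one of its k nearest minority-class neighbours. A minimal NumPy sketch of that idea (not the imblearn implementation; `smote_like` is a hypothetical helper):

```python
import numpy as np

def smote_like(X_min, n_new, k=5, seed=0):
    """Generate n_new synthetic samples by interpolating each chosen
    minority point toward one of its k nearest minority neighbours."""
    rng = np.random.RandomState(seed)
    # pairwise distances within the minority class
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)
    nn = np.argsort(d, axis=1)[:, :k]          # k nearest neighbours per point
    idx = rng.randint(len(X_min), size=n_new)  # base points
    nbr = nn[idx, rng.randint(k, size=n_new)]  # one random neighbour each
    gap = rng.rand(n_new, 1)                   # interpolation factor in [0, 1)
    return X_min[idx] + gap * (X_min[nbr] - X_min[idx])

X_min = np.random.RandomState(1).normal(size=(10, 3))  # toy minority class
X_new = smote_like(X_min, n_new=20)
print(X_new.shape)  # → (20, 3)
```

Because each synthetic point lies on a segment between two real minority samples, it stays inside the minority class's feature range rather than duplicating rows exactly.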
# Initializing various classification algorithms on the SMOTE dataset and choosing the best model based on f1 score
models = []
models.append(("LR", LogisticRegression()))
models.append(("KNN", KNeighborsClassifier()))
models.append(("GNB", GaussianNB()))
models.append(("SVM", SVC(kernel='linear')))
models.append(("DT", DecisionTreeClassifier()))
models.append(("RF", RandomForestClassifier()))
models.append(("AB", AdaBoostClassifier()))
models.append(("GBT", GradientBoostingClassifier()))
models.append(("XGB", XGBClassifier(verbosity=0)))
models.append(("LightGBM",LGBMClassifier()))
#testing models
results = []
names = []
for name, model in models:
    kfold = StratifiedKFold(n_splits=10, random_state=25, shuffle=True)
    cv_results = cross_val_score(model, X_SMOTE, y_SMOTE, cv=kfold, scoring='f1')
    results.append(cv_results)
    names.append(name)
    msg = '%s: %f%% (%f%%)' % (name, cv_results.mean()*100, cv_results.std()*100)
    print(msg)
LR: 86.378229% (2.156129%)
KNN: 53.896911% (1.046391%)
GNB: 78.950312% (5.647012%)
SVM: 87.427750% (2.899358%)
DT: 80.347386% (4.283734%)
RF: 96.729228% (1.934272%)
AB: 88.000888% (2.018243%)
GBT: 95.265767% (1.399911%)
XGB: 95.916882% (1.995075%)
LightGBM: 96.985307% (0.934066%)
# Implementing random over sampling
from imblearn.over_sampling import RandomOverSampler
over = RandomOverSampler(sampling_strategy=0.5)
X_over, y_over = over.fit_resample(X_train, Y_train)
print("Over Training Fail : {0} ({1:0.2f}%)".format(len(y_over[y_over[:] == 1]), (len(y_over[y_over[:] == 1])/len(y_over)) * 100))
print("Over Training Pass : {0} ({1:0.2f}%)".format(len(y_over[y_over[:] == 0]), (len(y_over[y_over[:] == 0])/len(y_over)) * 100))
Over Training Fail : 511 (33.31%)
Over Training Pass : 1023 (66.69%)
# Initializing various classification algorithms on the random over sampled dataset and choosing the best model based on f1 score
models = []
models.append(("LR", LogisticRegression()))
models.append(("KNN", KNeighborsClassifier()))
models.append(("GNB", GaussianNB()))
models.append(("SVM", SVC(kernel='linear')))
models.append(("DT", DecisionTreeClassifier()))
models.append(("RF", RandomForestClassifier()))
models.append(("AB", AdaBoostClassifier()))
models.append(("GBT", GradientBoostingClassifier()))
models.append(("XGB", XGBClassifier(verbosity=0)))
models.append(("LightGBM",LGBMClassifier()))
#testing models
results = []
names = []
for name, model in models:
    kfold = StratifiedKFold(n_splits=10, random_state=69, shuffle=True)
    cv_results = cross_val_score(model, X_over, y_over, cv=kfold, scoring='f1')
    results.append(cv_results)
    names.append(name)
    msg = '%s: %f%% (%f%%)' % (name, cv_results.mean()*100, cv_results.std()*100)
    print(msg)
LR: 87.646646% (3.358769%)
KNN: 88.126508% (2.327381%)
GNB: 65.429189% (4.253808%)
SVM: 88.401029% (3.418731%)
DT: 93.048645% (2.367218%)
RF: 100.000000% (0.000000%)
AB: 91.826643% (2.870384%)
GBT: 99.131815% (0.799062%)
XGB: 99.611650% (0.475629%)
LightGBM: 100.000000% (0.000000%)
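The perfect 100% scores for RF and LightGBM should be read with caution: the oversampling was applied before cross-validation, so exact copies of the same minority rows likely land in both the training and test folds, and models that can memorize rows score almost perfectly. The effect can be reproduced on pure-noise features (synthetic data, illustrative only); resampling inside each fold, e.g. with imblearn's `Pipeline`, avoids it:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Pure-noise features with a ~10% minority class (hypothetical data)
rng = np.random.RandomState(0)
X = rng.normal(size=(600, 10))
y = (rng.rand(600) < 0.1).astype(int)

# Random oversampling BEFORE cross-validation: duplicate minority rows
minority = np.where(y == 1)[0]
dup = rng.choice(minority, size=4 * len(minority), replace=True)
X_os = np.vstack([X, X[dup]])
y_os = np.concatenate([y, y[dup]])

# Copies of the same row now straddle the fold boundaries, so a
# memorizing model looks far better than chance even on noise
kfold = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
leaky_f1 = cross_val_score(RandomForestClassifier(random_state=0), X_os, y_os,
                           cv=kfold, scoring="f1")
print(f"F1 on noise with pre-CV oversampling: {leaky_f1.mean():.3f}")
```

Wrapping the sampler and the classifier in an `imblearn.pipeline.Pipeline` and passing that to `cross_val_score` refits the resampling on each training fold only, which removes this leakage.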
# Implementing ADASYN over sampling
from imblearn.over_sampling import ADASYN
oversample = ADASYN(sampling_strategy=0.5)
X_adasyn, y_adasyn = oversample.fit_resample(X_train, Y_train)
print("ADASYN Training Fail : {0} ({1:0.2f}%)".format(len(y_adasyn[y_adasyn[:] == 1]), (len(y_adasyn[y_adasyn[:] == 1])/len(y_adasyn)) * 100))
print("ADASYN Training Pass : {0} ({1:0.2f}%)".format(len(y_adasyn[y_adasyn[:] == 0]), (len(y_adasyn[y_adasyn[:] == 0])/len(y_adasyn)) * 100))
ADASYN Training Fail : 526 (33.96%)
ADASYN Training Pass : 1023 (66.04%)
# Initializing various classification algorithms on the ADASYN dataset and choosing the best model based on f1 score
models = []
models.append(("LR", LogisticRegression()))
models.append(("KNN", KNeighborsClassifier()))
models.append(("GNB", GaussianNB()))
models.append(("SVM", SVC(kernel='linear')))
models.append(("DT", DecisionTreeClassifier()))
models.append(("RF", RandomForestClassifier()))
models.append(("AB", AdaBoostClassifier()))
models.append(("GBT", GradientBoostingClassifier()))
models.append(("XGB", XGBClassifier(verbosity=0)))
models.append(("LightGBM",LGBMClassifier()))
#testing models
results = []
names = []
for name, model in models:
    kfold = StratifiedKFold(n_splits=10, random_state=33, shuffle=True)
    cv_results = cross_val_score(model, X_adasyn, y_adasyn, cv=kfold, scoring='f1')
    results.append(cv_results)
    names.append(name)
    msg = '%s: %f%% (%f%%)' % (name, cv_results.mean()*100, cv_results.std()*100)
    print(msg)
LR: 87.578994% (3.223106%)
KNN: 54.887473% (0.984854%)
GNB: 79.871638% (3.735608%)
SVM: 89.379277% (3.038395%)
DT: 84.304979% (3.897509%)
RF: 97.452531% (1.196681%)
AB: 87.813766% (2.749315%)
GBT: 95.342101% (1.619089%)
XGB: 97.517472% (1.412383%)
LightGBM: 97.766353% (1.260817%)
# building a Gaussian Naive Bayes model on the original (imbalanced) training data
nb = GaussianNB()
nb.fit(X_train, Y_train)
GaussianNB()
modelnb_score = nb.score(X_train, Y_train)
print('Accuracy Score of Training Data: ', modelnb_score)
Accuracy Score of Training Data: 0.864963503649635
y_predictnb = nb.predict(X_test)
from sklearn.metrics import accuracy_score
modelnb_score = accuracy_score(Y_test, y_predictnb)
print('Accuracy Score of Test Data:', modelnb_score)
Accuracy Score of Test Data: 0.8471337579617835
#printing classification report
from sklearn import metrics
print("Classification Report")
print(metrics.classification_report(Y_test, y_predictnb, labels=[1, 0]))
Classification Report
precision recall f1-score support
1 0.21 0.48 0.29 31
0 0.96 0.87 0.91 440
accuracy 0.85 471
macro avg 0.59 0.68 0.60 471
weighted avg 0.91 0.85 0.87 471
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_predictnb)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.xlabel('Predicted Classes', fontsize = 15)
plt.ylabel('Actual Classes', fontsize = 15)
plt.title('Confusion Matrix for GNB', fontsize = 15);
#Plotting ROC and AUC
probs = nb.predict_proba(X_test)
preds = probs[:,1]
fpr, tpr, threshold = metrics.roc_curve(Y_test, preds)
roc_auc_nb = metrics.auc(fpr, tpr)
plt.plot(fpr, tpr, label='Gaussian Naive Bayes (AUC = %0.2f)' % roc_auc_nb)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([-0.02, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.show()
i = np.arange(len(tpr)) # index for df
roc = pd.DataFrame({'fpr': pd.Series(fpr, index=i),
                    'tpr': pd.Series(tpr, index=i),
                    '1-fpr': pd.Series(1 - fpr, index=i),
                    'tf': pd.Series(tpr - (1 - fpr), index=i),
                    'threshold': pd.Series(threshold, index=i)})
print(roc.loc[(roc.tf-0).abs().argsort()[:1]])
         fpr       tpr     1-fpr        tf  threshold
42  0.304545  0.709677  0.695455  0.014223   0.016154
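The `tf = tpr - (1 - fpr)` column locates the point where sensitivity and specificity cross; an equivalent and common shortcut is to maximise Youden's J = tpr − fpr directly (the toy scores below are hypothetical):

```python
import numpy as np
from sklearn.metrics import roc_curve

# Toy labels and scores to illustrate picking a threshold from the ROC curve
y_true  = np.array([0, 0, 0, 0, 1, 0, 1, 1, 0, 1])
y_score = np.array([0.1, 0.2, 0.3, 0.35, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9])

fpr, tpr, thr = roc_curve(y_true, y_score)
j = np.argmax(tpr - fpr)  # index of the maximum Youden's J statistic
print(f"best threshold by Youden's J: {thr[j]:.2f}")
```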
# store the predicted probabilities for the Fail class
y_pred_prob = nb.predict_proba(X_test)[:, 1]
# predict a fail if the predicted probability is greater than 0.0161
y_pred_class = binarize([y_pred_prob], threshold=0.0161)[0]
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_pred_class)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for GNB', fontsize = 15);
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_pred_class, labels=[1, 0]))
Classification Report
              precision    recall  f1-score   support

           1       0.14      0.71      0.24        31
           0       0.97      0.70      0.81       440

    accuracy                           0.70       471
   macro avg       0.56      0.70      0.52       471
weighted avg       0.92      0.70      0.77       471
precision_nb, recall_nb, f1_score_nb, support = precision_recall_fscore_support(Y_test, y_pred_class, average = 'macro')
print('Precision Score :', '%0.2f' % precision_nb)
print('Recall Score :', '%0.2f' % recall_nb)
print('F1-Score:', '%0.2f' % f1_score_nb)
nb_acc= accuracy_score(Y_test, y_predictnb)
print('Accuracy Score :','%0.2f' % nb_acc)
print('AUC :','%0.2f' % roc_auc_nb)
Thresholdnb = 0.016
print('Thresholdnb :','%0.2f' % Thresholdnb)
Precision Score : 0.56
Recall Score : 0.70
F1-Score: 0.52
Accuracy Score : 0.85
AUC : 0.73
Thresholdnb : 0.02
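The next model is trained on `X_under`/`y_under`, an under-sampled version of the training split built earlier in the notebook (most likely with imbalanced-learn's `RandomUnderSampler`). The idea, downsampling the majority class to the minority-class size, can be sketched in plain numpy:

```python
import numpy as np

rng = np.random.default_rng(1)

def random_undersample(X, y):
    """Downsample every class to the minority-class size
    by drawing row indices without replacement."""
    classes, counts = np.unique(y, return_counts=True)
    n_min = counts.min()
    keep = np.concatenate([
        rng.choice(np.flatnonzero(y == c), size=n_min, replace=False)
        for c in classes
    ])
    rng.shuffle(keep)
    return X[keep], y[keep]

# toy imbalanced data: 90 majority (0) vs 10 minority (1) rows
X = np.arange(200).reshape(100, 2)
y = np.array([0] * 90 + [1] * 10)
X_u, y_u = random_undersample(X, y)  # 10 rows of each class remain
```

Under-sampling throws away majority-class information, which is why its test accuracy drops below the baseline model's.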
nbu = GaussianNB()
nbu.fit(X_under, y_under)
GaussianNB()
modelnbu_score = nbu.score(X_under,y_under)
print('Accuracy Score of Training Data: ', modelnbu_score)
Accuracy Score of Training Data: 0.7671232876712328
y_predictnbu= nbu.predict(X_test)
modelnbu_score = accuracy_score(Y_test, y_predictnbu)
print('Accuracy Score of Test Data:', modelnbu_score)
Accuracy Score of Test Data: 0.6857749469214437
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_predictnbu, labels=[1, 0]))
Classification Report
              precision    recall  f1-score   support

           1       0.12      0.61      0.20        31
           0       0.96      0.69      0.80       440

    accuracy                           0.69       471
   macro avg       0.54      0.65      0.50       471
weighted avg       0.91      0.69      0.76       471
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_predictnbu)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for GNB Under sampled', fontsize = 15);
#Plotting ROC and AUC
probs = nbu.predict_proba(X_test)
preds = probs[:,1]
fpr, tpr, threshold = metrics.roc_curve(Y_test, preds)
roc_auc_nbu = metrics.auc(fpr, tpr)
plt.plot(fpr, tpr, label='Gaussian NB under sampled (AUC = %0.2f)' % roc_auc_nbu)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([-0.02, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.show()
i = np.arange(len(tpr)) # index for df
roc = pd.DataFrame({'fpr': pd.Series(fpr, index=i),
                    'tpr': pd.Series(tpr, index=i),
                    '1-fpr': pd.Series(1 - fpr, index=i),
                    'tf': pd.Series(tpr - (1 - fpr), index=i),
                    'threshold': pd.Series(threshold, index=i)})
print(roc.loc[(roc.tf-0).abs().argsort()[:1]])
         fpr       tpr     1-fpr        tf  threshold
40  0.313636  0.677419  0.686364 -0.008944   0.475317
# store the predicted probabilities for failed class
y_pred_prob = nbu.predict_proba(X_test)[:, 1]
# predict fail if the predicted probability is greater than 0.4753
y_pred_class = binarize([y_pred_prob], threshold=0.4753)[0]
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_pred_class)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for GNB Under sampled', fontsize = 15);
precision_nbu, recall_nbu, f1_score_nbu, support = precision_recall_fscore_support(Y_test, y_pred_class, average = 'macro')
print('Precision Score :', '%0.2f' % precision_nbu)
print('Recall Score :', '%0.2f' % recall_nbu)
print('F1-Score:', '%0.2f' % f1_score_nbu)
nbu_acc= accuracy_score(Y_test, y_predictnbu)
print('Accuracy Score :','%0.2f' % nbu_acc)
print('AUC :','%0.2f' % roc_auc_nbu)
Thresholdnbu = 0.4753
print('Thresholdnbu:','%0.2f' % Thresholdnbu)
Precision Score : 0.55
Recall Score : 0.68
F1-Score: 0.51
Accuracy Score : 0.69
AUC : 0.71
Thresholdnbu: 0.48
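The LightGBM model below is trained on `X_SMOTE`/`y_SMOTE`, assumed to come from SMOTE over-sampling of the training split (typically imbalanced-learn's `SMOTE`). The core mechanism, interpolating between a minority sample and one of its minority-class nearest neighbours, can be sketched as follows (helper name `smote_sketch` is ours):

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)

def smote_sketch(X_min, n_new, k=5):
    """Generate n_new synthetic minority samples: pick a minority
    point, pick one of its k nearest minority neighbours, and take
    a random point on the segment between them."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X_min)
    _, idx = nn.kneighbors(X_min)                 # idx[:, 0] is the point itself
    base = rng.integers(0, len(X_min), n_new)     # which minority point to start from
    neigh = idx[base, rng.integers(1, k + 1, n_new)]  # a random true neighbour
    gap = rng.random((n_new, 1))                  # interpolation fraction in [0, 1)
    return X_min[base] + gap * (X_min[neigh] - X_min[base])

X_min = rng.normal(size=(20, 3))   # stand-in minority class
X_syn = smote_sketch(X_min, n_new=30)
```

Because every synthetic point is a convex combination of two minority samples, the new points always lie inside the minority class's bounding box.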
param_test ={'num_leaves': sp_randint(6, 50),
'min_child_samples': sp_randint(100, 500),
'min_child_weight': [1e-5, 1e-3, 1e-2, 1e-1, 1, 1e1, 1e2, 1e3, 1e4],
'subsample': sp_uniform(loc=0.2, scale=0.8),
'colsample_bytree': sp_uniform(loc=0.4, scale=0.6),
'reg_alpha': [0, 1e-1, 1, 2, 5, 7, 10, 50, 100],
'reg_lambda': [0, 1e-1, 1, 5, 10, 20, 50, 100]}
sample = 100
# n_estimators is set to a "large value"; the actual number of trees built depends on early stopping, and 2000 defines only the absolute maximum
lgb = LGBMClassifier(max_depth=-1, random_state=31, silent=True, metric='f1', n_jobs=4, n_estimators=2000)
gs = RandomizedSearchCV(
estimator=lgb, param_distributions=param_test,
n_iter=sample,
scoring='f1',
cv=5,
refit=True,
random_state=314,
verbose=True)
gs.fit(X_SMOTE, y_SMOTE)
gs.best_params_
Fitting 5 folds for each of 100 candidates, totalling 500 fits
{'colsample_bytree': 0.952164731370897,
'min_child_samples': 111,
'min_child_weight': 0.01,
'num_leaves': 38,
'reg_alpha': 0,
'reg_lambda': 0.1,
'subsample': 0.3029313662262354}
lgb=LGBMClassifier(colsample_bytree=0.95,
min_child_samples= 111,
min_child_weight= 0.01,
num_leaves= 38,
reg_alpha= 0,
reg_lambda= 0.1,
subsample=0.30)
lgb.fit(X_SMOTE,y_SMOTE)
LGBMClassifier(colsample_bytree=0.95, min_child_samples=111,
min_child_weight=0.01, num_leaves=38, reg_alpha=0,
reg_lambda=0.1, subsample=0.3)
modellgb1=lgb.score(X_SMOTE,y_SMOTE)
print('Accuracy Score of Training Data: ', modellgb1)
Accuracy Score of Training Data: 1.0
y_predictlg1= lgb.predict(X_test)
modellg1 = accuracy_score(Y_test, y_predictlg1)
print('Accuracy Score of Test Data:', modellg1)
Accuracy Score of Test Data: 0.9256900212314225
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_predictlg1, labels=[1, 0]))
Classification Report
              precision    recall  f1-score   support

           1       0.17      0.03      0.05        31
           0       0.94      0.99      0.96       440

    accuracy                           0.93       471
   macro avg       0.55      0.51      0.51       471
weighted avg       0.88      0.93      0.90       471
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_predictlg1)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for LGBM Smote', fontsize = 15);
#Plotting ROC and AUC
probs = lgb.predict_proba(X_test)
preds = probs[:,1]
fpr, tpr, threshold = metrics.roc_curve(Y_test, preds)
roc_auc_lg = metrics.auc(fpr, tpr)
plt.plot(fpr, tpr, label='LGBM Smote sampled (AUC = %0.2f)' % roc_auc_lg)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([-0.02, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.show()
i = np.arange(len(tpr)) # index for df
roc = pd.DataFrame({'fpr': pd.Series(fpr, index=i),
                    'tpr': pd.Series(tpr, index=i),
                    '1-fpr': pd.Series(1 - fpr, index=i),
                    'tf': pd.Series(tpr - (1 - fpr), index=i),
                    'threshold': pd.Series(threshold, index=i)})
print(roc.loc[(roc.tf-0).abs().argsort()[:1]])
         fpr       tpr     1-fpr        tf  threshold
38  0.388636  0.612903  0.611364   0.00154   0.031447
# store the predicted probabilities for failed class
y_pred_prob = lgb.predict_proba(X_test)[:, 1]
# predict fail if the predicted probability is greater than 0.0314
y_pred_class = binarize([y_pred_prob], threshold=0.0314)[0]
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_pred_class, labels=[1, 0]))
Classification Report
              precision    recall  f1-score   support

           1       0.10      0.61      0.17        31
           0       0.96      0.61      0.75       440

    accuracy                           0.61       471
   macro avg       0.53      0.61      0.46       471
weighted avg       0.90      0.61      0.71       471
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_pred_class)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for LGBM Smote', fontsize = 15);
precision_lg, recall_lg, f1_score_lg, support = precision_recall_fscore_support(Y_test, y_pred_class, average = 'macro')
print('Precision Score :', '%0.2f' % precision_lg)
print('Recall Score :', '%0.2f' % recall_lg)
print('F1-Score:', '%0.2f' % f1_score_lg)
lg_acc= accuracy_score(Y_test, y_pred_class)
print('Accuracy Score :','%0.2f' % lg_acc)
print('AUC :','%0.2f' % roc_auc_lg)
Thresholdlg = 0.0314
print('Thresholdlg :','%0.2f' % Thresholdlg)
Precision Score : 0.53
Recall Score : 0.61
F1-Score: 0.46
Accuracy Score : 0.61
AUC : 0.68
Thresholdlg : 0.03
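The Random Forest below is trained on `X_over`/`y_over`, assumed to be a randomly over-sampled training set (minority rows duplicated with replacement until the classes balance, e.g. via imbalanced-learn's `RandomOverSampler`). A minimal numpy sketch of that idea:

```python
import numpy as np

rng = np.random.default_rng(7)

def random_oversample(X, y):
    """Resample every class (with replacement) up to the
    majority-class count, so the classes end up balanced."""
    classes, counts = np.unique(y, return_counts=True)
    n_max = counts.max()
    take = np.concatenate([
        rng.choice(np.flatnonzero(y == c), size=n_max, replace=True)
        for c in classes
    ])
    rng.shuffle(take)
    return X[take], y[take]

# toy imbalanced data: 45 majority (0) vs 5 minority (1) rows
X = np.arange(100).reshape(50, 2)
y = np.array([0] * 45 + [1] * 5)
X_o, y_o = random_oversample(X, y)  # 45 rows of each class
```

Duplicating minority rows makes over-fitting easy, which matches the near-perfect training accuracy seen for the tuned forest below.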
# Number of trees in random forest
n_estimators = [int(x) for x in np.linspace(start = 50, stop = 500, num = 50)]
# Number of features to consider at every split
max_features = ['auto', 'sqrt']  # note: 'auto' (equivalent to 'sqrt' for classifiers) was removed in newer scikit-learn
# Maximum number of levels in tree
max_depth = [int(x) for x in np.linspace(10, 110, num = 11)]
max_depth.append(None)
# Minimum number of samples required to split a node
min_samples_split = range(2,100,5)
# Minimum number of samples required at each leaf node
min_samples_leaf = range(1,100,10)
# Method of selecting samples for training each tree
bootstrap = [True, False]
# Create the random grid
random_grid = {'n_estimators': n_estimators,
'max_features': max_features,
'max_depth': max_depth,
'min_samples_split': min_samples_split,
'min_samples_leaf': min_samples_leaf,
'bootstrap': bootstrap,
'criterion':['gini','entropy']}
rf = RandomForestClassifier()
rf_random = RandomizedSearchCV(estimator = rf, param_distributions = random_grid, cv = 5, verbose=2, random_state=90, n_jobs = -1)
rf_random.fit(X_over, y_over)
rf_random.best_params_
Fitting 5 folds for each of 10 candidates, totalling 50 fits
{'n_estimators': 463,
'min_samples_split': 82,
'min_samples_leaf': 1,
'max_features': 'sqrt',
'max_depth': 110,
'criterion': 'gini',
'bootstrap': False}
rf_grid1 = RandomForestClassifier(n_estimators=463,
min_samples_split= 82,
min_samples_leaf=1,
max_features= 'sqrt',
max_depth= 110,
criterion= 'gini',
bootstrap= False)
rf_grid1.fit(X_over, y_over)
RandomForestClassifier(bootstrap=False, max_depth=110, max_features='sqrt',
min_samples_split=82, n_estimators=463)
modelrfg1_score=rf_grid1.score(X_over,y_over)
print('Accuracy Score of Training Data: ', modelrfg1_score)
Accuracy Score of Training Data: 0.9980443285528031
y_predictrfg1= rf_grid1.predict(X_test)
modelrfg1_score = accuracy_score(Y_test, y_predictrfg1)
print('Accuracy Score of Test Data:', modelrfg1_score)
Accuracy Score of Test Data: 0.9341825902335457
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_predictrfg1, labels=[1, 0]))
Classification Report
              precision    recall  f1-score   support

           1       0.00      0.00      0.00        31
           0       0.93      1.00      0.97       440

    accuracy                           0.93       471
   macro avg       0.47      0.50      0.48       471
weighted avg       0.87      0.93      0.90       471
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_predictrfg1)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for RF Over Sampled', fontsize = 15);
#Plotting ROC and AUC
probs = rf_grid1.predict_proba(X_test)
preds = probs[:,1]
fpr, tpr, threshold = metrics.roc_curve(Y_test, preds)
roc_auc_rfo = metrics.auc(fpr, tpr)
plt.plot(fpr, tpr, label='RF over sampled (AUC = %0.2f)' % roc_auc_rfo)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([0, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.show()
i = np.arange(len(tpr)) # index for df
roc = pd.DataFrame({'fpr': pd.Series(fpr, index=i),
                    'tpr': pd.Series(tpr, index=i),
                    '1-fpr': pd.Series(1 - fpr, index=i),
                    'tf': pd.Series(tpr - (1 - fpr), index=i),
                    'threshold': pd.Series(threshold, index=i)})
print(roc.loc[(roc.tf-0).abs().argsort()[:1]])
         fpr       tpr     1-fpr        tf  threshold
35  0.272727  0.709677  0.727273 -0.017595   0.168882
# store the predicted probabilities for failed class
y_pred_prob = rf_grid1.predict_proba(X_test)[:, 1]
# predict fail if the predicted probability is greater than 0.1688
y_pred_class = binarize([y_pred_prob], threshold=0.1688)[0]
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_pred_class, labels=[1, 0]))
Classification Report
              precision    recall  f1-score   support

           1       0.15      0.71      0.25        31
           0       0.97      0.73      0.83       440

    accuracy                           0.73       471
   macro avg       0.56      0.72      0.54       471
weighted avg       0.92      0.73      0.79       471
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_pred_class)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for RF Over Sampled', fontsize = 15);
precision_rfo, recall_rfo, f1_score_rfo, support = precision_recall_fscore_support(Y_test, y_pred_class, average = 'macro')
print('Precision Score :', '%0.2f' % precision_rfo)
print('Recall Score :', '%0.2f' % recall_rfo)
print('F1-Score:', '%0.2f' % f1_score_rfo)
rfo_acc= accuracy_score(Y_test, y_pred_class)
print('Accuracy Score :','%0.2f' % rfo_acc)
print('AUC :','%0.2f' % roc_auc_rfo)
Thresholdrf = 0.1688
print('Thresholdrf :','%0.2f' % Thresholdrf)
Precision Score : 0.56
Recall Score : 0.72
F1-Score: 0.54
Accuracy Score : 0.73
AUC : 0.76
Thresholdrf : 0.17
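The final LightGBM model is trained on `X_adasyn`/`y_adasyn`, assumed to come from ADASYN over-sampling (likely imbalanced-learn's `ADASYN`). ADASYN is SMOTE with an adaptive twist: minority samples surrounded by more majority-class neighbours receive proportionally more synthetic points. A rough sketch of that weighting (helper name `adasyn_sketch` is ours):

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(3)

def adasyn_sketch(X, y, minority=1, n_new=50, k=5):
    """SMOTE-style interpolation, but each minority point's chance of
    being a base sample is weighted by the fraction of majority-class
    points among its k nearest neighbours in the full data."""
    X_min = X[y == minority]
    # density weights: how "hard" (majority-surrounded) each minority point is
    nn_all = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx_all = nn_all.kneighbors(X_min)
    w = (y[idx_all[:, 1:]] != minority).mean(axis=1)
    w = np.ones_like(w) / len(w) if w.sum() == 0 else w / w.sum()
    base = rng.choice(len(X_min), size=n_new, p=w)
    # interpolate towards a random minority-class neighbour
    nn_min = NearestNeighbors(n_neighbors=min(k, len(X_min) - 1) + 1).fit(X_min)
    _, idx_min = nn_min.kneighbors(X_min)
    neigh = idx_min[base, rng.integers(1, idx_min.shape[1], n_new)]
    gap = rng.random((n_new, 1))
    return X_min[base] + gap * (X_min[neigh] - X_min[base])

# toy data: 40 majority points around 0, 10 minority points around 1
X = np.vstack([rng.normal(0, 1, (40, 2)), rng.normal(1, 1, (10, 2))])
y = np.array([0] * 40 + [1] * 10)
X_syn = adasyn_sketch(X, y, n_new=30)
```

The effect is to concentrate synthetic samples near the class boundary rather than uniformly across the minority region.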
param_test ={'num_leaves': sp_randint(6, 50),
'min_child_samples': sp_randint(100, 500),
'min_child_weight': [1e-5, 1e-3, 1e-2, 1e-1, 1, 1e1, 1e2, 1e3, 1e4],
'subsample': sp_uniform(loc=0.2, scale=0.8),
'colsample_bytree': sp_uniform(loc=0.4, scale=0.6),
'reg_alpha': [0, 1e-1, 1, 2, 5, 7, 10, 50, 100],
'reg_lambda': [0, 1e-1, 1, 5, 10, 20, 50, 100]}
sample = 100
# n_estimators is set to a "large value"; the actual number of trees built depends on early stopping, and 2000 defines only the absolute maximum
lgb = LGBMClassifier(max_depth=-1, random_state=31, silent=True, metric='f1', n_jobs=4, n_estimators=2000)
gs = RandomizedSearchCV(
estimator=lgb, param_distributions=param_test,
n_iter=sample,
scoring='f1',
cv=5,
refit=True,
random_state=314,
verbose=True)
gs.fit(X_adasyn, y_adasyn)
gs.best_params_
Fitting 5 folds for each of 100 candidates, totalling 500 fits
{'colsample_bytree': 0.952164731370897,
'min_child_samples': 111,
'min_child_weight': 0.01,
'num_leaves': 38,
'reg_alpha': 0,
'reg_lambda': 0.1,
'subsample': 0.3029313662262354}
lgb=LGBMClassifier(colsample_bytree=0.95,
min_child_samples= 111,
min_child_weight= 0.01,
num_leaves= 38,
reg_alpha= 0,
reg_lambda= 0.1,
subsample=0.30)
lgb.fit(X_adasyn,y_adasyn)
LGBMClassifier(colsample_bytree=0.95, min_child_samples=111,
min_child_weight=0.01, num_leaves=38, reg_alpha=0,
reg_lambda=0.1, subsample=0.3)
modellgb=lgb.score(X_adasyn,y_adasyn)
print('Accuracy Score of Training Data: ', modellgb)
Accuracy Score of Training Data: 1.0
y_predictlg= lgb.predict(X_test)
modellg = accuracy_score(Y_test, y_predictlg)
print('Accuracy Score of Test Data:', modellg)
Accuracy Score of Test Data: 0.9256900212314225
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_predictlg, labels=[1, 0]))
Classification Report
              precision    recall  f1-score   support

           1       0.00      0.00      0.00        31
           0       0.93      0.99      0.96       440

    accuracy                           0.93       471
   macro avg       0.47      0.50      0.48       471
weighted avg       0.87      0.93      0.90       471
#Plotting ROC and AUC
probs = lgb.predict_proba(X_test)
preds = probs[:,1]
fpr, tpr, threshold = metrics.roc_curve(Y_test, preds)
roc_auc_lg1 = metrics.auc(fpr, tpr)
plt.plot(fpr, tpr, label='LGBM ADASYN (AUC = %0.2f)' % roc_auc_lg1)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([-0.02, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.show()
i = np.arange(len(tpr)) # index for df
roc = pd.DataFrame({'fpr': pd.Series(fpr, index=i),
                    'tpr': pd.Series(tpr, index=i),
                    '1-fpr': pd.Series(1 - fpr, index=i),
                    'tf': pd.Series(tpr - (1 - fpr), index=i),
                    'threshold': pd.Series(threshold, index=i)})
print(roc.loc[(roc.tf-0).abs().argsort()[:1]])
         fpr       tpr     1-fpr        tf  threshold
43  0.304545  0.709677  0.695455  0.014223   0.041761
# store the predicted probabilities for failed class
y_pred_prob = lgb.predict_proba(X_test)[:, 1]
# predict fail if the predicted probability is greater than 0.0417
y_pred_class = binarize([y_pred_prob], threshold=0.0417)[0]
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_pred_class, labels=[1, 0]))
Classification Report
              precision    recall  f1-score   support

           1       0.14      0.71      0.24        31
           0       0.97      0.70      0.81       440

    accuracy                           0.70       471
   macro avg       0.56      0.70      0.52       471
weighted avg       0.92      0.70      0.77       471
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_pred_class)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for LGBM ADASYN', fontsize = 15);
precision_lg1, recall_lg1, f1_score_lg1, support = precision_recall_fscore_support(Y_test, y_pred_class, average = 'macro')
print('Precision Score :', '%0.2f' % precision_lg1)
print('Recall Score :', '%0.2f' % recall_lg1)
print('F1-Score:', '%0.2f' % f1_score_lg1)
lg1_acc= accuracy_score(Y_test, y_pred_class)
print('Accuracy Score :','%0.2f' % lg1_acc)
print('AUC :','%0.2f' % roc_auc_lg1)
Thresholdlg1 = 0.0417
print('Thresholdlg1 :','%0.2f' % Thresholdlg1)
Precision Score : 0.56
Recall Score : 0.70
F1-Score: 0.52
Accuracy Score : 0.70
AUC : 0.74
Thresholdlg1 : 0.04
modellists = []
modellists.append(['Gaussian NB Normal Data', nb_acc * 100, recall_nb * 100, precision_nb * 100,roc_auc_nb*100,f1_score_nb*100,Thresholdnb])
modellists.append(['Gaussian NB under sampled data', nbu_acc * 100, recall_nbu * 100, precision_nbu * 100, roc_auc_nbu * 100, f1_score_nbu * 100, Thresholdnbu])
modellists.append(['LGBM Smote sampled Data', lg_acc * 100, recall_lg * 100, precision_lg * 100,roc_auc_lg*100,f1_score_lg*100,Thresholdlg])
modellists.append(['Random Forest Over sampled Data', rfo_acc * 100, recall_rfo * 100, precision_rfo * 100,roc_auc_rfo*100,f1_score_rfo*100,Thresholdrf])
modellists.append(['LGBM ADASYN sampled Data', lg1_acc * 100, recall_lg1 * 100, precision_lg1 * 100,roc_auc_lg1*100,f1_score_lg1*100,Thresholdlg1])
model_df = pd.DataFrame(modellists, columns = ['Model', 'Accuracy Scores on Test', 'Recall Score', 'Precision Score','AUC','F1 Score','Threshold'])
model_df
| | Model | Accuracy Scores on Test | Recall Score | Precision Score | AUC | F1 Score | Threshold |
|---|---|---|---|---|---|---|---|
| 0 | Gaussian NB Normal Data | 84.713376 | 70.256598 | 55.622711 | 73.196481 | 52.294507 | 0.0160 |
| 1 | Gaussian NB under sampled data | 68.577495 | 68.189150 | 55.001209 | 70.689150 | 51.212206 | 0.4753 |
| 2 | LGBM Smote sampled Data | 61.146497 | 61.213343 | 52.864769 | 68.218475 | 45.906578 | 0.0314 |
| 3 | Random Forest Over sampled Data | 72.611465 | 71.847507 | 56.378698 | 75.916422 | 54.329247 | 0.1688 |
| 4 | LGBM ADASYN sampled Data | 69.639066 | 70.256598 | 55.622711 | 74.024927 | 52.294507 | 0.0417 |
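With the summary table built, the candidates can be ranked directly on the metrics that matter for a rare-fail problem (recall, F1, AUC) rather than raw accuracy. A small pandas sketch on a stand-in frame with the same columns (values taken from the table above, abridged to three rows):

```python
import pandas as pd

model_df = pd.DataFrame(
    [['Gaussian NB Normal Data',         84.71, 70.26, 55.62, 73.20, 52.29, 0.0160],
     ['Random Forest Over sampled Data', 72.61, 71.85, 56.38, 75.92, 54.33, 0.1688],
     ['LGBM ADASYN sampled Data',        69.64, 70.26, 55.62, 74.02, 52.29, 0.0417]],
    columns=['Model', 'Accuracy Scores on Test', 'Recall Score',
             'Precision Score', 'AUC', 'F1 Score', 'Threshold'])

# sort by F1 first, break ties on AUC
ranked = model_df.sort_values(['F1 Score', 'AUC'], ascending=False)
best = ranked.iloc[0]['Model']
```

On these numbers the thresholded Random Forest on over-sampled data comes out on top, which is consistent with it being the model carried into the validation step below.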
#making copies of validation dataset
val1=val.copy()
val1=val1.reset_index(drop=True)
val1=pd.DataFrame(val1)
val2=val.copy()
val2=val2.reset_index(drop=True)
val2=pd.DataFrame(val2)
val3=val.copy()
val3=val3.reset_index(drop=True)
val3=pd.DataFrame(val3)
val4=val.copy()
val4=val4.reset_index(drop=True)
val4=pd.DataFrame(val4)
#Fitting Random forest
rf_grid1.fit(X_over, y_over)
pred=rf_grid1.predict(val1)
val1['Pass/Fail'] = pred
val1 = val1[(val1['Pass/Fail'] == 1)]
val1.head()
(Output: val1.head() — validation rows the tuned Random Forest predicts as Fail (Pass/Fail = 1). Each row carries the retained scaled feature columns plus the predicted label; row index 11 is the first flagged entity. The very wide table is omitted here.)
#fitting Random Forest with threshold
# store the predicted probabilities for failed class
y_pred_prob = rf_grid1.predict_proba(val2)[:, 1]
# predict fail if the predicted probability is greater than 0.1688
pred = binarize([y_pred_prob], threshold=0.1688)[0]
val2['Pass/Fail'] = pred
val2 = val2[(val2['Pass/Fail'] == 1)]
val2.head(18)
(Output: val2.head(18) — validation rows flagged as Fail (Pass/Fail = 1) once the 0.1688 probability threshold is applied; rows 0, 2, 8 and 10 appear among the first flagged entities, so the lower threshold flags more rows than the default 0.5 cut. The very wide feature table is omitted here.)
| 11 | -1.305030 | -0.295197 | -0.196644 | 0.787541 | -0.899710 | 0.065423 | 1.319303 | -1.870629 | -2.187508 | 0.055000 | 0.872225 | 0.615534 | 0.386885 | 1.135829 | -0.979410 | -0.031442 | 0.515958 | -2.423074 | -1.348142 | -0.791661 | 1.526909 | 0.350516 | 2.317848 | -0.424897 | -0.064749 | 0.332195 | 0.855194 | -1.800897 | 0.975184 | 0.079905 | 1.276967 | 0.094912 | 1.248753 | 2.000396 | -0.920018 | -1.304138 | -0.502127 | -1.787665 | -0.971501 | -0.909473 | -2.083356 | 0.482763 | 0.518936 | 0.222525 | 0.444581 | -0.691323 | 0.623560 | -1.636567 | -0.237128 | -1.287214 | -1.447789 | -0.014610 | 0.692520 | -0.076390 | -0.071476 | -2.183300 | 0.861970 | 0.196065 | -0.685447 | 0.013117 | 0.071631 | 0.040620 | 1.444477 | -0.016233 | 0.202395 | -0.727914 | 0.968889 | -0.948976 | -0.570470 | -0.828552 | 1.248833 | -0.034356 | -1.026351 | -0.097427 | 0.887994 | -1.402825 | 0.290266 | -0.587076 | 0.717480 | 2.304036 | 0.587452 | -0.459007 | 0.546485 | -0.617087 | -0.668174 | -0.429481 | 0.736710 | 0.116571 | 2.334591 | -1.154209 | -0.666745 | 0.087319 | 1.108300 | 1.113267 | -0.664708 | -0.075114 | 0.750902 | -1.175998 | -0.761280 | -0.487023 | -0.742421 | -1.618830 | -1.607628 | -1.119170 | 0.063005 | 1.618044 | -0.597430 | 0.771738 | 0.042392 | -0.662651 | -1.256061 | -0.647066 | 0.780113 | 1.454023 | 0.111617 | 1.405453 | -0.246536 | -0.789565 | -1.706823 | 0.049191 | 2.565222 | 1.111234 | 1.493219 | 1.109714 | 0.693491 | -1.563959 | -2.182179 | -0.250519 | 0.250340 | -1.992389 | 0.577076 | 0.434860 | -0.467304 | -0.572257 | -0.068570 | -0.132303 | -0.497814 | 2.289562 | -0.580445 | 1.837681 | -1.350979 | -1.154603 | 0.048101 | 0.467965 | -1.161192 | 1.064915 | 0.438640 | -0.875779 | 0.869232 | -0.219690 | -0.901517 | -1.115397 | -0.875037 | -0.387987 | 0.639047 | -0.977465 | 0.219717 | -0.157392 | 1.059078 | -0.727031 | -1.156699 | -1.022856 | -1.034019 | -0.899540 | -0.808767 | 0.105296 | 2.073032 | -0.744957 | 1.018369 | -0.834202 | -0.582912 | -0.374845 | 
0.365666 | 0.928824 | 2.083586 | 0.606205 | 0.198621 | -0.298067 | 0.416107 | 0.241773 | -0.534322 | 2.526623 | -0.426818 | -0.027964 | 1.613318 | 1.909378 | -0.022017 | 1.0 |
| 14 | -0.737510 | 2.204895 | 0.901897 | -1.184030 | -0.008529 | -0.233807 | 1.132113 | -1.449020 | -0.698478 | 0.295974 | 1.164467 | -0.187133 | -0.687946 | -0.221595 | 0.322962 | 0.036146 | 0.180397 | 0.450554 | -2.000311 | 0.121094 | 0.707487 | -0.023434 | 1.116489 | 1.041433 | -1.335374 | -2.019537 | -0.199795 | -1.065773 | 0.223319 | -1.481255 | -1.256597 | 0.094912 | -0.328766 | -0.104216 | -0.728826 | 0.628529 | 0.404472 | -1.031013 | 1.016717 | -0.513489 | -1.042570 | 0.443949 | -0.913024 | 0.015078 | 1.832555 | 0.248673 | 0.780342 | -0.218943 | 0.178197 | 1.982196 | -0.040565 | 0.620411 | 0.159026 | -0.467493 | -0.216352 | 1.587417 | -0.919067 | -0.563748 | -0.685447 | -0.968963 | 1.132710 | -0.616699 | 2.104475 | -0.540993 | 0.434018 | -0.727914 | 0.898237 | -2.211029 | -0.317092 | 0.041441 | -2.452347 | -0.393691 | -1.026351 | 1.090043 | -0.308927 | -0.956973 | -0.796595 | 0.308323 | 0.247759 | -0.233742 | -1.654253 | 1.315834 | 1.454324 | 0.808645 | -0.386708 | 0.984529 | 0.088459 | -1.045754 | 2.542106 | -0.421262 | 0.426203 | -0.227068 | 0.545717 | 1.113267 | -0.072058 | -0.075114 | -0.028965 | 0.495531 | -0.653242 | -0.620854 | -0.492424 | -0.327913 | -1.369403 | -0.557618 | 1.020938 | 0.717560 | 0.159169 | 0.426771 | -0.653208 | 0.108822 | -0.127493 | 0.898288 | -0.796669 | -0.550096 | -0.787889 | -0.510037 | 1.775010 | 0.042861 | -1.095836 | -0.211608 | 2.533287 | 0.520657 | 2.923217 | 1.290564 | 0.554240 | -1.047188 | -0.904579 | 0.735690 | -0.353691 | -0.480740 | 1.264974 | -0.299086 | 1.127051 | -0.028328 | -0.675197 | -0.792450 | 0.294173 | -0.319744 | -0.580445 | 0.034267 | 1.104618 | 0.128064 | -0.089824 | -0.754606 | -1.450976 | 2.498806 | 0.538718 | 0.596826 | 1.299165 | 0.236883 | 0.143962 | -0.216823 | 1.237636 | -0.500700 | -0.327908 | -0.905275 | 0.755421 | -0.184360 | 2.080267 | -0.513625 | -1.156699 | 0.074306 | -0.503939 | -0.558274 | 0.203292 | 0.435457 | 0.251655 | 1.414997 | -0.082742 | 1.380837 | -0.582912 | -0.374845 | 0.365666 
| 0.441762 | -2.053497 | -1.018464 | 0.211403 | -1.511814 | -0.031634 | 0.694265 | 1.675514 | 1.071642 | 2.352458 | -1.409121 | -0.035784 | 2.061691 | 1.463184 | 1.0 |
| 15 | -0.344301 | 0.818583 | 0.901897 | -1.184030 | -0.008529 | 1.139051 | 1.895271 | -0.304653 | -0.328938 | 1.293168 | 1.447275 | -0.650519 | -0.624814 | 0.220880 | 0.322962 | 2.474508 | 0.153098 | 0.450554 | -2.364762 | -0.335412 | 0.989080 | -0.023434 | 1.116489 | 1.041433 | -1.335374 | -2.019537 | -0.467660 | -1.357930 | 0.628920 | -1.450064 | -0.509520 | 0.094912 | -0.549959 | 0.832062 | -1.765415 | -0.093801 | -0.718988 | 0.648455 | 1.122547 | -0.265999 | -0.862798 | 0.228709 | -0.888100 | 0.015078 | 2.251151 | 0.397583 | 1.037199 | 0.163065 | -1.628468 | 1.982196 | 0.801637 | -0.665338 | 0.159026 | 0.258080 | 0.467462 | 2.170981 | -0.306836 | -0.563748 | -0.685447 | 0.535799 | 2.901175 | 0.139218 | 2.471141 | -0.238024 | -0.287784 | -0.727914 | 0.953285 | -0.688858 | 0.335467 | 0.041441 | -1.704645 | 0.280062 | -1.211496 | 0.890129 | 0.561561 | -0.690881 | 0.238013 | 0.233720 | -0.020120 | 0.073704 | -0.596845 | 0.119105 | 0.077243 | 0.808645 | -0.386708 | 0.984529 | 0.088459 | -0.445537 | -0.048692 | -1.138447 | 0.262072 | 0.577525 | -0.183558 | 0.244197 | -0.654401 | -0.075114 | -0.028965 | 0.663299 | -0.726955 | 0.315957 | -0.742421 | 0.069292 | -1.526530 | -0.557618 | 1.020938 | 0.717560 | 0.159169 | 0.426771 | -0.653208 | 0.108822 | -1.422410 | -0.818772 | -0.903616 | -0.366310 | -0.964770 | -0.612880 | -0.100839 | -1.579903 | -1.366273 | -0.319506 | -0.085389 | -0.016834 | 2.327384 | 0.186630 | 0.355310 | -1.700534 | -0.098928 | 0.194645 | -0.875202 | 0.023144 | 1.264974 | -0.839889 | 0.594652 | 1.034718 | 0.107547 | -0.070414 | 2.526137 | -0.993114 | -0.580445 | 3.294510 | 0.976322 | 0.128064 | 0.164807 | -0.975963 | -1.157861 | 1.685062 | -1.405336 | 0.577158 | 0.404365 | -0.947360 | -0.662026 | 2.676662 | 0.828150 | -0.559715 | -0.621216 | -0.941628 | 0.859906 | 0.487374 | -1.116932 | 0.360996 | -1.156699 | -0.271233 | -0.588571 | -0.727765 | 0.844124 | -0.191212 | 0.227137 | -0.744957 | -0.082742 | -0.834202 | -0.582912 | -0.374845 | 0.365666 | 
-0.196556 | -1.953933 | -1.066916 | 0.211403 | -1.511814 | 1.377339 | 0.843976 | -0.059928 | 0.829145 | 1.765722 | -1.911361 | -0.310634 | 0.310085 | 0.523732 | 1.0 |
# fitting Naive Bayes on the validation dataset
from sklearn.naive_bayes import GaussianNB  # not included in the imports above
nb = GaussianNB()
nb.fit(X_train, Y_train)
pred = nb.predict(val3)
val3['Pass/Fail'] = pred
val3 = val3[val3['Pass/Fail'] == 1]
val3.head(18)
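When ground-truth labels are available for a held-out split, the same fit/predict pattern can be scored with the `confusion_matrix` imported earlier. Below is a minimal, self-contained sketch on synthetic data; the array names and the labeling rule are illustrative stand-ins, not the notebook's actual variables:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                  # synthetic stand-in for standardized sensor features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # hypothetical rule: 1 = fail, 0 = pass

nb = GaussianNB()
nb.fit(X[:150], y[:150])                       # fit on the first 150 rows
pred = nb.predict(X[150:])                     # predict on the held-out 50 rows
cm = confusion_matrix(y[150:], pred)           # rows = true class, columns = predicted class
```

With labels in hand, the confusion matrix makes the pass/fail trade-off visible rather than just a count of predicted fails.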
*(val3.head(18) output: the entities the model predicts as fail, shown as the standardized values of the retained feature columns (indices 0 through 589) plus the appended Pass/Fail column, which is 1 for every retained row; full numeric table omitted)*
# fitting Naive Bayes with a lowered decision threshold
# store the predicted probabilities for the fail class
y_pred_prob = nb.predict_proba(val4)[:, 1]
# predict fail if the predicted probability is greater than 0.01
pred = binarize([y_pred_prob], threshold=0.01)[0]
val4['Pass/Fail'] = pred
val4 = val4[val4['Pass/Fail'] == 1]
val4.head(19)
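Lowering the threshold below the implicit 0.5 used by `predict()` trades precision for recall: more entities get flagged as fail, which is often preferable when missing a failing unit is costlier than a false alarm. A minimal sketch on synthetic data (all names and the labeling rule are illustrative, not the notebook's `val4`):

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 4))
y = (X[:, 0] > 1.0).astype(int)              # hypothetical rare "fail" class (~16%)

nb = GaussianNB().fit(X[:200], y[:200])
proba_fail = nb.predict_proba(X[200:])[:, 1]  # P(fail) for each held-out row

n_default = int((proba_fail > 0.5).sum())     # predict()'s implicit 0.5 cutoff
n_lowered = int((proba_fail > 0.01).sum())    # lowered cutoff flags at least as many fails
```

Because every probability above 0.5 is also above 0.01, `n_lowered` can never be smaller than `n_default`; the lowered threshold only ever widens the flagged set.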
[Output: the 14 validation entities predicted to fail (index rows 2-5, 7-12 and 14-17), each shown with its z-scored values for the selected signal features and Pass/Fail = 1.0.]
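The thresholding step above can be sketched on a small synthetic probability vector (the `y_pred_prob` values here are made up for illustration); an entity is flagged as a fail whenever its predicted fail probability exceeds the cut-off:

```python
import numpy as np

# hypothetical predicted fail probabilities for six entities
y_pred_prob = np.array([0.005, 0.02, 0.40, 0.009, 0.15, 0.011])

# label an entity as fail (1) when its probability exceeds the cut-off
threshold = 0.01
pred = (y_pred_prob > threshold).astype(int)
print(pred)  # → [0 1 1 0 1 1]
```

Lowering the cut-off below the default 0.5 trades type 1 error (false alarms) for sensitivity, which is the motivation for the custom threshold in the cell above.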
1) Gaussian Naive Bayes without any sampling gives a sensitivity of 48% with a type 2 error rate of 52%, predicting 9 observations to have failed. Adding a threshold of 0.016 raises the sensitivity to 71%, reducing the type 2 error rate by 23% at the cost of a 17% increase in the type 1 error rate, while predicting 14 observations to have failed.
2) Random Forest with randomly over-sampled data gives 100% specificity and a 100% type 2 error rate, predicting only 1 observation to have failed. Adding a threshold of 0.1688 gives a sensitivity of 71%, reducing the type 2 error rate by 71% at the cost of a 27% increase in the type 1 error rate, while predicting 7 observations as failed.
3) Furthermore, we could have tried modelling the data using only the important variables from the variable importance plot, to check whether the performance of the model improves.
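The error rates quoted in points 1 and 2 all come straight from the confusion matrix; a minimal sketch with made-up labels (1 = fail, the positive class) shows how each is computed:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# hypothetical ground truth and predictions: 1 = fail, 0 = pass
y_true = np.array([1, 1, 1, 1, 0, 0, 0, 0, 0, 0])
y_pred = np.array([1, 1, 1, 0, 0, 0, 0, 0, 1, 1])

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)   # recall on the fail class
specificity = tn / (tn + fp)   # recall on the pass class
type1_error = fp / (fp + tn)   # false alarms, = 1 - specificity
type2_error = fn / (fn + tp)   # missed fails, = 1 - sensitivity
print(sensitivity, specificity, type1_error, type2_error)
```

Type 2 errors (fails predicted as passes) are the costly ones downstream in the line, which is why the thresholds above are tuned to push sensitivity up even though type 1 error rises.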
#loading the data and pre-processing for PCA
com=pd.read_csv('./signal-data+Future.csv')
com=com.drop(['Time'],axis=1)
#dropping the columns that have a constant signal
cols = com.select_dtypes([np.number]).columns
std = com[cols].std()
cols_to_drop = std[std==0].index
com.drop(cols_to_drop, axis=1,inplace=True)
#label encoding the target class
com['Pass/Fail']=com['Pass/Fail'].replace([-1,1],[0,1])
#replacing NaN/NA with zero, treating a missing value as no signal
com.fillna(0,inplace=True)
row,column=com.shape
print('The dataset contains', row, 'rows and', column, 'columns')
The dataset contains 1585 rows and 475 columns
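The constant-signal drop and the NaN fill above can be checked on a toy frame (the sensor names here are invented for illustration; the rule is the same `std == 0` test used in the cell above):

```python
import pandas as pd
import numpy as np

toy = pd.DataFrame({
    'sensor_a': [1.0, 2.0, 3.0],    # varying signal - kept
    'sensor_b': [5.0, 5.0, 5.0],    # constant signal - dropped
    'sensor_c': [0.1, np.nan, 0.3]  # varying, one missing reading - kept
})

std = toy.select_dtypes([np.number]).std()
toy = toy.drop(std[std == 0].index, axis=1)  # same rule as the cell above
toy = toy.fillna(0)                          # treat NaN as "no signal"
print(list(toy.columns))  # → ['sensor_a', 'sensor_c']
```

Note the order matters: constant columns are identified before the NaN fill, since filling with zero first could change a column's standard deviation.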
sg=com.iloc[0:1567,:]
y=sg['Pass/Fail']
sg=sg.drop(['Pass/Fail'],axis=1)
row,column=sg.shape
print('The past dataset contains', row, 'rows and', column, 'columns')
The past dataset contains 1567 rows and 474 columns
val=com.iloc[1567:1586,:]
val=val.drop(['Pass/Fail'],axis=1)
row,column=val.shape
print('The validation dataset contains', row, 'rows and', column, 'columns')
The validation dataset contains 18 rows and 474 columns
#scaling with z-score
comScaled= com.apply(zscore)
#dropping NaN
comScaled.dropna(axis=1,inplace=True)
comScaled=comScaled.drop(['Pass/Fail'],axis=1)
#splitting the dataset into train and validation set
X=comScaled.iloc[0:1567,:]
val=comScaled.iloc[1567:,:]
X.shape,val.shape
((1567, 474), (18, 474))
from sklearn.decomposition import PCA
#extracting components that explain 95% of the variation
pca = PCA(.95)
pca_ = pca.fit_transform(X)
pca
PCA(n_components=0.95)
X_pca = pca.transform(X) # PCs for the train data
val_pca = pca.transform(val) # PCs for the test data
X_pca.shape, val_pca.shape
((1567, 158), (18, 158))
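Passing a float between 0 and 1 to PCA tells scikit-learn to keep the smallest number of components whose explained-variance ratios sum to at least that fraction; a minimal sketch on synthetic data (this is demo data, not the signal dataset):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.RandomState(0)
base = rng.randn(200, 3)
# 10 features: 3 informative, 3 near-duplicates, 4 pure noise
X_demo = np.hstack([base, base + 0.01 * rng.randn(200, 3), rng.randn(200, 4)])
pca_demo = PCA(0.95).fit(X_demo)
# fewer components than features, yet >= 95% of the variance is retained
print(pca_demo.n_components_, pca_demo.explained_variance_ratio_.sum())
```

On the signal data this is exactly how 474 scaled features collapse to 158 components.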
pca.explained_variance_
array([26.16675992, 18.06107703, 13.70927143, 12.29759902, 10.68546427,
9.95331148, 9.35729925, 8.64823007, 8.08850525, 7.79321193,
7.09249305, 6.72414102, 6.62861272, 6.40165754, 6.21690301,
6.11813448, 5.98425001, 5.64375621, 5.52390322, 5.32483464,
5.24356088, 5.11652015, 4.89806327, 4.87611358, 4.78524547,
4.55731408, 4.55014951, 4.51054696, 4.41780524, 4.31966871,
4.19704364, 4.10602219, 4.00232687, 3.80286722, 3.76971573,
3.65748448, 3.59958841, 3.52983411, 3.48399387, 3.44072958,
3.353488 , 3.2958605 , 3.2225608 , 3.15668446, 3.11604997,
3.04538131, 3.03512913, 2.95745781, 2.92179189, 2.89733894,
2.83098572, 2.75049517, 2.73566038, 2.68415761, 2.64588815,
2.58646971, 2.52657042, 2.46606138, 2.41574704, 2.40307721,
2.39838111, 2.34932852, 2.33184995, 2.30230267, 2.23268507,
2.20141171, 2.16673249, 2.12754602, 2.11916282, 2.0744845 ,
2.03800282, 2.01010478, 1.96980494, 1.9541063 , 1.91815398,
1.88580381, 1.87497006, 1.82829522, 1.81366643, 1.75414635,
1.71645182, 1.68588229, 1.62684268, 1.59440729, 1.58072669,
1.53665979, 1.48787667, 1.47501257, 1.44904352, 1.41482642,
1.40791826, 1.39098015, 1.37296634, 1.3620427 , 1.31507312,
1.30625315, 1.29028314, 1.24975958, 1.24607143, 1.20550936,
1.18555758, 1.17655053, 1.16166167, 1.13155167, 1.1272968 ,
1.1120928 , 1.07942703, 1.07167768, 1.05823454, 1.03664846,
1.0277201 , 1.01696247, 1.01007544, 0.99778863, 0.98174266,
0.97716602, 0.95058551, 0.94054336, 0.92501523, 0.91011522,
0.9051339 , 0.88944079, 0.88424035, 0.8734794 , 0.86841133,
0.85644853, 0.8388387 , 0.83199791, 0.82759449, 0.81387677,
0.80413196, 0.80019534, 0.78692071, 0.78500161, 0.77423149,
0.75788977, 0.74168079, 0.72878388, 0.71747416, 0.71574486,
0.7058286 , 0.70069105, 0.69479867, 0.690416 , 0.67980762,
0.67394823, 0.65533052, 0.64430608, 0.63561071, 0.63209706,
0.62624229, 0.62234877, 0.60665287, 0.59850219, 0.59434714,
0.56651921, 0.5621297 , 0.54754901])
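The absolute eigenvalues printed above can be converted to a cumulative explained-variance ratio to confirm that the retained components just cross the 95% line; a self-contained sketch on demo data:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.RandomState(1)
X_demo = rng.randn(100, 6)
pca_demo = PCA(0.95).fit(X_demo)
cum = np.cumsum(pca_demo.explained_variance_ratio_)
# the last retained component is the first to push the cumulative ratio past 0.95
print(cum)
```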
#creating a dataframe of the components and attaching it to the target class
pca_df = pd.DataFrame(data = X_pca)
df=pd.concat([pca_df,y],axis=1)
df.head()
df.head(): 5 rows x 159 columns (principal components 0-157 plus the Pass/Fail label); the wide numeric preview is omitted for brevity.
#separating the dependent variable (DV) from the independent variables (IDV)
X=df.drop(['Pass/Fail'],axis=1)
y=df['Pass/Fail']
# additional imports used in the modelling section below
import seaborn as sns
from sklearn import metrics
from sklearn.model_selection import train_test_split, StratifiedKFold, cross_val_score
from sklearn.metrics import accuracy_score, precision_recall_fscore_support
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier, GradientBoostingClassifier
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier
from imblearn.under_sampling import RandomUnderSampler
from imblearn.over_sampling import SMOTE, RandomOverSampler, ADASYN
# splitting data into training and test set for independent attributes
X_train, X_test, Y_train, Y_test = train_test_split(X, y, test_size=.30, random_state=105, stratify=y)
print("Training Fail : {0} ({1:0.2f}%)".format(len(Y_train[Y_train[:] == 1]), (len(Y_train[Y_train[:] == 1])/len(Y_train)) * 100))
print("Training Pass : {0} ({1:0.2f}%)".format(len(Y_train[Y_train[:] == 0]), (len(Y_train[Y_train[:] == 0])/len(Y_train)) * 100))
print("")
print("Test Fail : {0} ({1:0.2f}%)".format(len(Y_test[Y_test[:] == 1]), (len(Y_test[Y_test[:] == 1])/len(Y_test)) * 100))
print("Test Pass : {0} ({1:0.2f}%)".format(len(Y_test[Y_test[:] == 0]), (len(Y_test[Y_test[:] == 0])/len(Y_test)) * 100))
print("")
Training Fail : 73 (6.66%)
Training Pass : 1023 (93.34%)

Test Fail : 31 (6.58%)
Test Pass : 440 (93.42%)
# Initializing various classification algorithms with normal dataset
models = []
models.append(("LR", LogisticRegression()))
models.append(("KNN", KNeighborsClassifier()))
models.append(("GNB", GaussianNB()))
models.append(("SVM", SVC(kernel='linear')))
models.append(("DT", DecisionTreeClassifier()))
models.append(("RF", RandomForestClassifier()))
models.append(("AB", AdaBoostClassifier()))
models.append(("GBT", GradientBoostingClassifier()))
models.append(("XGB", XGBClassifier(verbosity=0)))
models.append(("LightGBM",LGBMClassifier()))
#testing models
results = []
names = []
for name, model in models:
    kfold = StratifiedKFold(n_splits=10, random_state=55, shuffle=True)
    cv_results = cross_val_score(model, X_train, Y_train, cv=kfold, scoring='f1')
    results.append(cv_results)
    names.append(name)
    msg = '%s: %f%% (%f%%)' % (name, cv_results.mean()*100, cv_results.std()*100)
    print(msg)
LR: 18.810568% (18.008092%)
KNN: 0.000000% (0.000000%)
GNB: 16.255599% (7.081171%)
SVM: 18.793196% (12.056760%)
DT: 6.810967% (10.936352%)
RF: 0.000000% (0.000000%)
AB: 6.341991% (10.111747%)
GBT: 2.222222% (6.666667%)
XGB: 0.000000% (0.000000%)
LightGBM: 0.000000% (0.000000%)
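Several of the zero F1 scores above simply mean the classifier never predicts the minority (fail) class on such imbalanced folds; a majority-class baseline reproduces the effect:

```python
import numpy as np
from sklearn.dummy import DummyClassifier
from sklearn.metrics import accuracy_score, f1_score

# ~7% failures, roughly the imbalance in this dataset
y_demo = np.array([0] * 93 + [1] * 7)
X_demo = np.zeros((100, 1))
dummy = DummyClassifier(strategy='most_frequent').fit(X_demo, y_demo)
pred = dummy.predict(X_demo)
# high accuracy, but F1 for the fail class collapses to zero
print(accuracy_score(y_demo, pred), f1_score(y_demo, pred, zero_division=0))
```

This is why the notebook turns to resampling next: accuracy alone would make these models look deceptively good.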
# Implementing random under sampling
under = RandomUnderSampler(sampling_strategy=0.5)
X_under, y_under = under.fit_resample(X_train, Y_train)
print("Under Training Fail : {0} ({1:0.2f}%)".format(len(y_under[y_under[:] == 1]), (len(y_under[y_under[:] == 1])/len(y_under)) * 100))
print("under Training Pass : {0} ({1:0.2f}%)".format(len(y_under[y_under[:] == 0]), (len(y_under[y_under[:] == 0])/len(y_under)) * 100))
Under Training Fail : 73 (33.33%)
under Training Pass : 146 (66.67%)
# Initializing various classification algorithms with under sampled dataset
models = []
models.append(("LR", LogisticRegression()))
models.append(("KNN", KNeighborsClassifier()))
models.append(("GNB", GaussianNB()))
models.append(("SVM", SVC(kernel='linear')))
models.append(("DT", DecisionTreeClassifier()))
models.append(("RF", RandomForestClassifier()))
models.append(("AB", AdaBoostClassifier()))
models.append(("GBT", GradientBoostingClassifier()))
models.append(("XGB", XGBClassifier(verbosity=0)))
models.append(("LightGBM",LGBMClassifier()))
#testing models
results = []
names = []
for name, model in models:
    kfold = StratifiedKFold(n_splits=10, random_state=77, shuffle=True)
    cv_results = cross_val_score(model, X_under, y_under, cv=kfold, scoring='f1')
    results.append(cv_results)
    names.append(name)
    msg = '%s: %f%% (%f%%)' % (name, cv_results.mean()*100, cv_results.std()*100)
    print(msg)
LR: 39.451681% (18.770440%)
KNN: 36.588550% (14.558491%)
GNB: 36.821068% (19.109300%)
SVM: 39.952596% (15.831455%)
DT: 30.516817% (12.556153%)
RF: 15.521368% (16.508396%)
AB: 36.566688% (11.831624%)
GBT: 20.833056% (16.010799%)
XGB: 29.763015% (17.099830%)
LightGBM: 23.311577% (17.170448%)
# Implementing SMOTE
smt = SMOTE(sampling_strategy=0.5)
X_SMOTE, y_SMOTE = smt.fit_resample(X_train, Y_train)
print("SMOTE Training Fail : {0} ({1:0.2f}%)".format(len(y_SMOTE[y_SMOTE[:] == 1]), (len(y_SMOTE[y_SMOTE[:] == 1])/len(y_SMOTE)) * 100))
print("SMOTE Training Pass : {0} ({1:0.2f}%)".format(len(y_SMOTE[y_SMOTE[:] == 0]), (len(y_SMOTE[y_SMOTE[:] == 0])/len(y_SMOTE)) * 100))
SMOTE Training Fail : 511 (33.31%)
SMOTE Training Pass : 1023 (66.69%)
# Initializing various classification algorithms with smote sampled dataset
models = []
models.append(("LR", LogisticRegression()))
models.append(("KNN", KNeighborsClassifier()))
models.append(("GNB", GaussianNB()))
models.append(("SVM", SVC(kernel='linear')))
models.append(("DT", DecisionTreeClassifier()))
models.append(("RF", RandomForestClassifier()))
models.append(("AB", AdaBoostClassifier()))
models.append(("GBT", GradientBoostingClassifier()))
models.append(("XGB", XGBClassifier(verbosity=0)))
models.append(("LightGBM",LGBMClassifier()))
#testing models
results = []
names = []
for name, model in models:
    kfold = StratifiedKFold(n_splits=10, random_state=87, shuffle=True)
    cv_results = cross_val_score(model, X_SMOTE, y_SMOTE, cv=kfold, scoring='f1')
    results.append(cv_results)
    names.append(name)
    msg = '%s: %f%% (%f%%)' % (name, cv_results.mean()*100, cv_results.std()*100)
    print(msg)
LR: 78.727822% (4.429079%)
KNN: 61.857199% (1.104295%)
GNB: 60.929946% (3.225113%)
SVM: 81.435802% (3.034786%)
DT: 77.823108% (2.398956%)
RF: 94.866948% (3.099321%)
AB: 80.355898% (3.628175%)
GBT: 95.067420% (2.147922%)
XGB: 97.533985% (0.675093%)
LightGBM: 97.704184% (1.301273%)
# Implementing random over sampling
over = RandomOverSampler(sampling_strategy=0.5)
X_over, y_over = over.fit_resample(X_train, Y_train)
print("over Training Fail : {0} ({1:0.2f}%)".format(len(y_over[y_over[:] == 1]), (len(y_over[y_over[:] == 1])/len(y_over)) * 100))
print("over Training Pass : {0} ({1:0.2f}%)".format(len(y_over[y_over[:] == 0]), (len(y_over[y_over[:] == 0])/len(y_over)) * 100))
over Training Fail : 511 (33.31%)
over Training Pass : 1023 (66.69%)
# Initializing various classification algorithms with over sampled dataset
models = []
models.append(("LR", LogisticRegression()))
models.append(("KNN", KNeighborsClassifier()))
models.append(("GNB", GaussianNB()))
models.append(("SVM", SVC(kernel='linear')))
models.append(("DT", DecisionTreeClassifier()))
models.append(("RF", RandomForestClassifier()))
models.append(("AB", AdaBoostClassifier()))
models.append(("GBT", GradientBoostingClassifier()))
models.append(("XGB", XGBClassifier(verbosity=0)))
models.append(("LightGBM",LGBMClassifier()))
#testing models
results = []
names = []
for name, model in models:
    kfold = StratifiedKFold(n_splits=10, random_state=9, shuffle=True)
    cv_results = cross_val_score(model, X_over, y_over, cv=kfold, scoring='f1')
    results.append(cv_results)
    names.append(name)
    msg = '%s: %f%% (%f%%)' % (name, cv_results.mean()*100, cv_results.std()*100)
    print(msg)
LR: 81.203355% (2.937267%)
KNN: 84.526860% (3.117029%)
GNB: 73.482276% (4.820808%)
SVM: 84.143746% (3.524154%)
DT: 92.537220% (2.209473%)
RF: 100.000000% (0.000000%)
AB: 90.873870% (4.018143%)
GBT: 99.413500% (0.655362%)
XGB: 99.708738% (0.444910%)
LightGBM: 100.000000% (0.000000%)
# Implementing ADASYN over sampling
oversample = ADASYN(sampling_strategy=0.5)
X_adasyn, y_adasyn = oversample.fit_resample(X_train, Y_train)
print("ADASYN Training Fail : {0} ({1:0.2f}%)".format(len(y_adasyn[y_adasyn[:] == 1]), (len(y_adasyn[y_adasyn[:] == 1])/len(y_adasyn)) * 100))
print("ADASYN Training Pass : {0} ({1:0.2f}%)".format(len(y_adasyn[y_adasyn[:] == 0]), (len(y_adasyn[y_adasyn[:] == 0])/len(y_adasyn)) * 100))
ADASYN Training Fail : 518 (33.61%)
ADASYN Training Pass : 1023 (66.39%)
# Initializing various classification algorithms with ADASYN sampled dataset
models = []
models.append(("LR", LogisticRegression()))
models.append(("KNN", KNeighborsClassifier()))
models.append(("GNB", GaussianNB()))
models.append(("SVM", SVC(kernel='linear')))
models.append(("DT", DecisionTreeClassifier()))
models.append(("RF", RandomForestClassifier()))
models.append(("AB", AdaBoostClassifier()))
models.append(("GBT", GradientBoostingClassifier()))
models.append(("XGB", XGBClassifier(verbosity=0)))
models.append(("LightGBM",LGBMClassifier()))
#testing models
results = []
names = []
for name, model in models:
    kfold = StratifiedKFold(n_splits=10, random_state=53, shuffle=True)
    cv_results = cross_val_score(model, X_adasyn, y_adasyn, cv=kfold, scoring='f1')
    results.append(cv_results)
    names.append(name)
    msg = '%s: %f%% (%f%%)' % (name, cv_results.mean()*100, cv_results.std()*100)
    print(msg)
LR: 79.503491% (2.036465%)
KNN: 60.410944% (1.652599%)
GNB: 62.089774% (2.839143%)
SVM: 80.796210% (2.856949%)
DT: 79.743028% (3.914179%)
RF: 96.061103% (1.837581%)
AB: 80.782828% (2.464212%)
GBT: 95.809885% (1.673687%)
XGB: 98.935683% (0.809272%)
LightGBM: 98.844198% (0.941928%)
from sklearn.model_selection import RandomizedSearchCV,GridSearchCV
param_grid = [
    {'penalty': ['l1', 'l2', 'elasticnet', 'none'],
     'C': [1, 2, 4],
     'solver': ['lbfgs', 'newton-cg', 'liblinear', 'sag', 'saga'],
     'max_iter': [100, 1000, 2500, 5000]}
]
lg=LogisticRegression()
log = GridSearchCV(lg, param_grid = param_grid, cv = 5, verbose=2, n_jobs=-1)
log.fit(X_train,Y_train)
log.best_estimator_
Fitting 5 folds for each of 240 candidates, totalling 1200 fits
LogisticRegression(C=1, max_iter=1000, penalty='l1', solver='saga')
lg=LogisticRegression(C=1, max_iter=1000, penalty='l1', solver='saga')
lg.fit(X_train,Y_train)
LogisticRegression(C=1, max_iter=1000, penalty='l1', solver='saga')
modellg= lg.score(X_train, Y_train)
print('Accuracy Score of Training Data: ', modellg)
Accuracy Score of Training Data: 0.9598540145985401
y_predictlg= lg.predict(X_test)
modellg = accuracy_score(Y_test, y_predictlg)
print('Accuracy Score of Test Data:', modellg)
Accuracy Score of Test Data: 0.9129511677282378
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_predictlg, labels=[1, 0]))
Classification Report
precision recall f1-score support
1 0.14 0.06 0.09 31
0 0.94 0.97 0.95 440
accuracy 0.91 471
macro avg 0.54 0.52 0.52 471
weighted avg 0.88 0.91 0.90 471
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_predictlg)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix Log Reg', fontsize = 15);
#Plotting ROC and AUC
probs = lg.predict_proba(X_test)
preds = probs[:,1]
fpr, tpr, threshold = metrics.roc_curve(Y_test, preds)
roc_auc_lg = metrics.auc(fpr, tpr)
plt.plot(fpr, tpr, label='LogisticReg (AUC = %0.2f)' % roc_auc_lg)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([-0.02, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.show()
i = np.arange(len(tpr)) # index for df
roc = pd.DataFrame({'fpr' : pd.Series(fpr, index=i),'tpr' : pd.Series(tpr, index = i), '1-fpr' : pd.Series(1-fpr, index = i), 'tf' : pd.Series(tpr - (1-fpr), index = i), 'threshold' : pd.Series(threshold, index = i)})
print(roc.loc[(roc.tf-0).abs().argsort()[:1]])
fpr tpr 1-fpr tf threshold
33 0.384091 0.612903 0.615909 -0.003006 0.037636
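The lookup above selects the threshold where sensitivity and specificity cross (tpr closest to 1 - fpr); the same selection can be written directly with NumPy (toy labels and scores, not the notebook's data):

```python
import numpy as np
from sklearn.metrics import roc_curve

y_true = np.array([0, 0, 0, 0, 1, 1, 1, 0, 1, 0])
y_prob = np.array([0.1, 0.2, 0.3, 0.35, 0.4, 0.5, 0.6, 0.65, 0.7, 0.8])
# keep every threshold so the crossing point is not dropped
fpr, tpr, thr = roc_curve(y_true, y_prob, drop_intermediate=False)
# threshold where tpr is closest to 1 - fpr (sensitivity ~ specificity)
best = thr[np.argmin(np.abs(tpr - (1 - fpr)))]
print(best)
```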
# store the predicted probabilities for failed class
y_pred_prob = lg.predict_proba(X_test)[:, 1]
# predict failed if the predicted probability is greater than 0.0376
from sklearn.preprocessing import binarize
y_pred_class = binarize([y_pred_prob], threshold=0.0376)[0]
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_pred_class, labels=[1, 0]))
Classification Report
precision recall f1-score support
1 0.10 0.61 0.17 31
0 0.96 0.62 0.75 440
accuracy 0.62 471
macro avg 0.53 0.61 0.46 471
weighted avg 0.90 0.62 0.71 471
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_pred_class)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for Log Reg', fontsize = 15);
precision_lg, recall_lg, f1_score_lg, support = precision_recall_fscore_support(Y_test, y_pred_class, average = 'macro')
print('Precision Score :', '%0.2f' % precision_lg)
print('Recall Score :', '%0.2f' % recall_lg)
print('F1-Score:', '%0.2f' % f1_score_lg)
lg_acc= accuracy_score(Y_test, y_pred_class)
print('Accuracy Score :','%0.2f' % lg_acc)
print('AUC :','%0.2f' % roc_auc_lg)
print('Threshold :','%0.2f' % 0.0376)
Thresholdlo=0.0376
Precision Score : 0.53
Recall Score : 0.61
F1-Score: 0.46
Accuracy Score : 0.62
AUC : 0.66
Threshold : 0.04
param_grid = {'C': [0.1,1, 10, 100], 'gamma': [1,0.1,0.01,0.001],'kernel': ['rbf', 'poly', 'sigmoid']}
# Make grid search classifier
svm_grid= GridSearchCV(SVC(), param_grid, verbose = 2,cv=5, n_jobs = -1)
# Train the classifier
svm_grid.fit(X_under, y_under)
svm_grid.best_params_
Fitting 5 folds for each of 48 candidates, totalling 240 fits
{'C': 0.1, 'gamma': 1, 'kernel': 'sigmoid'}
svc_cv = SVC(kernel = 'sigmoid', gamma = 1, C = 0.1,probability=True)
svc_cv.fit(X_under,y_under)
SVC(C=0.1, gamma=1, kernel='sigmoid', probability=True)
modelsv_score = svc_cv.score(X_under, y_under)
print('Accuracy Score of Training Data: ', modelsv_score)
Accuracy Score of Training Data: 0.6027397260273972
y_predictsv= svc_cv.predict(X_test)
modelsv_score = accuracy_score(Y_test, y_predictsv)
print('Accuracy Score of Test Data:', modelsv_score)
Accuracy Score of Test Data: 0.8343949044585988
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_predictsv, labels=[1, 0]))
Classification Report
precision recall f1-score support
1 0.14 0.29 0.19 31
0 0.95 0.87 0.91 440
accuracy 0.83 471
macro avg 0.54 0.58 0.55 471
weighted avg 0.89 0.83 0.86 471
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_predictsv)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for SVC', fontsize = 15);
#Plotting ROC and AUC
probs = svc_cv.predict_proba(X_test)
preds = probs[:,1]
fpr, tpr, threshold = metrics.roc_curve(Y_test, preds)
roc_auc_sv = metrics.auc(fpr, tpr)
plt.plot(fpr, tpr, label='SVC (AUC = %0.2f)' % roc_auc_sv)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([-0.02, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.show()
i = np.arange(len(tpr)) # index for df
roc = pd.DataFrame({'fpr' : pd.Series(fpr, index=i),'tpr' : pd.Series(tpr, index = i), '1-fpr' : pd.Series(1-fpr, index = i), 'tf' : pd.Series(tpr - (1-fpr), index = i), 'threshold' : pd.Series(threshold, index = i)})
print(roc.loc[(roc.tf-0).abs().argsort()[:1]])
fpr tpr 1-fpr tf threshold
34 0.363636 0.645161 0.636364 0.008798 0.355506
# store the predicted probabilities for failed class
y_pred_prob = svc_cv.predict_proba(X_test)[:, 1]
# predict failed if the predicted probability is greater than 0.3555
y_pred_class = binarize([y_pred_prob], threshold=0.3555)[0]
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_pred_class, labels=[1, 0]))
Classification Report
precision recall f1-score support
1 0.11 0.65 0.19 31
0 0.96 0.64 0.77 440
accuracy 0.64 471
macro avg 0.54 0.64 0.48 471
weighted avg 0.91 0.64 0.73 471
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_pred_class)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for SVC', fontsize = 15);
precision_sv, recall_sv, f1_score_sv, support = precision_recall_fscore_support(Y_test, y_pred_class, average = 'macro')
print('Precision Score :', '%0.2f' % precision_sv)
print('Recall Score :', '%0.2f' % recall_sv)
print('F1-Score:', '%0.2f' % f1_score_sv)
sv_acc= accuracy_score(Y_test, y_pred_class)
print('Accuracy Score :','%0.2f' % sv_acc)
print('AUC :','%0.2f' % roc_auc_sv)
print('Threshold :','%0.2f' % 0.3555)
Thresholdsv=0.3555
Precision Score : 0.54
Recall Score : 0.64
F1-Score: 0.48
Accuracy Score : 0.64
AUC : 0.71
Threshold : 0.36
from scipy.stats import randint as sp_randint
from scipy.stats import uniform as sp_uniform
param_test = {'num_leaves': sp_randint(6, 50),
              'min_child_samples': sp_randint(100, 500),
              'min_child_weight': [1e-5, 1e-3, 1e-2, 1e-1, 1, 1e1, 1e2, 1e3, 1e4],
              'subsample': sp_uniform(loc=0.2, scale=0.8),
              'colsample_bytree': sp_uniform(loc=0.4, scale=0.6),
              'reg_alpha': [0, 1e-1, 1, 2, 5, 7, 10, 50, 100],
              'reg_lambda': [0, 1e-1, 1, 5, 10, 20, 50, 100]}
sample = 100
#n_estimators is set to a "large value"; the actual number of trees built depends on early stopping, and 2000 defines only the absolute maximum
lgb = LGBMClassifier(max_depth=-1, random_state=31, silent=True, metric='f1', n_jobs=4, n_estimators=2000)
gs = RandomizedSearchCV(
    estimator=lgb, param_distributions=param_test,
    n_iter=sample,
    scoring='f1',
    cv=5,
    refit=True,
    random_state=314,
    verbose=True)
gs.fit(X_SMOTE, y_SMOTE)
gs.best_params_
Fitting 5 folds for each of 100 candidates, totalling 500 fits
{'colsample_bytree': 0.952164731370897,
'min_child_samples': 111,
'min_child_weight': 0.01,
'num_leaves': 38,
'reg_alpha': 0,
'reg_lambda': 0.1,
'subsample': 0.3029313662262354}
lgb = LGBMClassifier(colsample_bytree=0.95,
                     min_child_samples=111,
                     min_child_weight=0.01,
                     num_leaves=38,
                     reg_alpha=0,
                     reg_lambda=0.1,
                     subsample=0.30)
lgb.fit(X_SMOTE,y_SMOTE)
LGBMClassifier(colsample_bytree=0.95, min_child_samples=111,
min_child_weight=0.01, num_leaves=38, reg_alpha=0,
reg_lambda=0.1, subsample=0.3)
modellgbm=lgb.score(X_under,y_under)
print('Accuracy Score of Training Data: ', modellgbm)
Accuracy Score of Training Data: 1.0
y_predictlgm= lgb.predict(X_test)
modellgm = accuracy_score(Y_test, y_predictlgm)
print('Accuracy Score of Test Data:', modellgm)
Accuracy Score of Test Data: 0.9235668789808917
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_predictlgm, labels=[1, 0]))
Classification Report
precision recall f1-score support
1 0.00 0.00 0.00 31
0 0.93 0.99 0.96 440
accuracy 0.92 471
macro avg 0.47 0.49 0.48 471
weighted avg 0.87 0.92 0.90 471
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_predictlgm)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for LGBM SMOTE', fontsize = 15);
#Plotting ROC and AUC
probs = lgb.predict_proba(X_test)
preds = probs[:,1]
fpr, tpr, threshold = metrics.roc_curve(Y_test, preds)
roc_auc_lgm = metrics.auc(fpr, tpr)
plt.plot(fpr, tpr, label='LGBM (AUC = %0.2f)' % roc_auc_lgm)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([-0.02, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.show()
i = np.arange(len(tpr)) # index for df
roc = pd.DataFrame({'fpr': pd.Series(fpr, index=i),
                    'tpr': pd.Series(tpr, index=i),
                    '1-fpr': pd.Series(1 - fpr, index=i),
                    'tf': pd.Series(tpr - (1 - fpr), index=i),
                    'threshold': pd.Series(threshold, index=i)})
print(roc.loc[(roc.tf-0).abs().argsort()[:1]])
         fpr       tpr     1-fpr        tf  threshold
34  0.456818  0.548387  0.543182  0.005205   0.044132
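The `tf = tpr - (1 - fpr)` column locates the operating point where sensitivity (tpr) and specificity (1 - fpr) balance. A minimal sketch of the same criterion on hypothetical scores:

```python
import numpy as np
from sklearn.metrics import roc_curve

# Hypothetical labels and scores: pick the threshold where
# tpr - (1 - fpr) is closest to zero.
y_true = np.array([0, 0, 0, 1, 0, 1, 1, 1])
scores = np.array([0.10, 0.20, 0.30, 0.40, 0.45, 0.60, 0.70, 0.80])

fpr, tpr, thresholds = roc_curve(y_true, scores, drop_intermediate=False)
best = np.argmin(np.abs(tpr - (1 - fpr)))
print(thresholds[best])  # 0.45
```

At 0.45 here, tpr = 0.75 and 1 - fpr = 0.75, so the two error rates are exactly balanced.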
# store the predicted probabilities for the failed class for all records
y_pred_prob = lgb.predict_proba(X_test)[:, 1]
# predict fail if the predicted probability is greater than 0.0441
from sklearn.preprocessing import binarize
y_pred_class = binarize([y_pred_prob], threshold=0.0441)[0]
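`binarize` is just an element-wise threshold (strictly greater than), so a plain NumPy comparison gives the same labels. A small sketch with hypothetical probabilities:

```python
import numpy as np
from sklearn.preprocessing import binarize

# Hypothetical probabilities; both forms predict fail when p > 0.0441.
probs = np.array([0.02, 0.05, 0.50, 0.90])
via_binarize = binarize(probs.reshape(1, -1), threshold=0.0441)[0]
via_numpy = (probs > 0.0441).astype(float)
print(via_binarize)  # [0. 1. 1. 1.]
```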
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_pred_class, labels=[1, 0]))
Classification Report
precision recall f1-score support
1 0.08 0.55 0.14 31
0 0.94 0.54 0.69 440
accuracy 0.54 471
macro avg 0.51 0.55 0.41 471
weighted avg 0.89 0.54 0.65 471
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_pred_class)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for LGBM SMOTE (tuned threshold)', fontsize = 15);
precision_lgm, recall_lgm, f1_score_lgm, support = precision_recall_fscore_support(Y_test, y_pred_class, average = 'macro')
print('Precision Score :', '%0.2f' % precision_lgm)
print('Recall Score :', '%0.2f' % recall_lgm)
print('F1-Score:', '%0.2f' % f1_score_lgm)
lgm_acc= accuracy_score(Y_test, y_pred_class)
print('Accuracy Score :','%0.2f' % lgm_acc)
print('AUC :','%0.2f' % roc_auc_lgm)
print('Threshold :','%0.2f' % 0.0441)
Thresholdlg=0.0441
Precision Score : 0.51
Recall Score : 0.55
F1-Score: 0.41
Accuracy Score : 0.54
AUC : 0.58
Threshold : 0.04
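The scores above use `average='macro'`, which computes the per-class metrics first and then averages them with equal weight per class, regardless of support. A tiny hypothetical example:

```python
from sklearn.metrics import precision_recall_fscore_support

# Class 1: precision = recall = 0.5; class 0: precision = recall = 0.75.
# Macro average takes the unweighted mean: (0.5 + 0.75) / 2 = 0.625.
y_true = [1, 1, 0, 0, 0, 0]
y_pred = [1, 0, 0, 0, 0, 1]
p, r, f, _ = precision_recall_fscore_support(y_true, y_pred, average='macro')
print(p, r, f)  # 0.625 0.625 0.625
```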
# Number of trees in random forest
n_estimators = [int(x) for x in np.linspace(start = 50, stop = 500, num = 50)]
# Number of features to consider at every split
max_features = ['auto', 'sqrt']
# Maximum number of levels in tree
max_depth = [int(x) for x in np.linspace(10, 110, num = 11)]
max_depth.append(None)
# Minimum number of samples required to split a node
min_samples_split = range(2,100,5)
# Minimum number of samples required at each leaf node
min_samples_leaf = range(1,100,10)
# Method of selecting samples for training each tree
bootstrap = [True, False]
# Create the random grid
random_grid = {'n_estimators': n_estimators,
'max_features': max_features,
'max_depth': max_depth,
'min_samples_split': min_samples_split,
'min_samples_leaf': min_samples_leaf,
'bootstrap': bootstrap,
'criterion':['gini','entropy']}
rf = RandomForestClassifier()
rf_random = RandomizedSearchCV(estimator = rf, param_distributions = random_grid, cv = 5, verbose=2, random_state=90, n_jobs = -1)
rf_random.fit(X_over, y_over)
rf_random.best_params_
Fitting 5 folds for each of 10 candidates, totalling 50 fits
{'n_estimators': 463,
'min_samples_split': 82,
'min_samples_leaf': 1,
'max_features': 'sqrt',
'max_depth': 110,
'criterion': 'gini',
'bootstrap': False}
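The same randomized-search pattern in miniature, on synthetic data with a toy grid (all names and values hypothetical), just to make the mechanics concrete: `RandomizedSearchCV` samples `n_iter` parameter combinations, cross-validates each, and exposes the winner via `best_params_`.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

# Toy data and a tiny grid: 2 * 2 * 2 = 8 combinations, 4 sampled.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
grid = {'n_estimators': [25, 50],
        'max_depth': [3, None],
        'criterion': ['gini', 'entropy']}
search = RandomizedSearchCV(RandomForestClassifier(random_state=0),
                            param_distributions=grid, n_iter=4, cv=3,
                            random_state=0)
search.fit(X, y)
print(search.best_params_)
```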
rf_grid1 = RandomForestClassifier(n_estimators=463,
min_samples_split= 82,
min_samples_leaf=1,
max_features= 'sqrt',
max_depth= 110,
criterion= 'gini',
bootstrap= False)
rf_grid1.fit(X_over, y_over)
RandomForestClassifier(bootstrap=False, max_depth=110, max_features='sqrt',
min_samples_split=82, n_estimators=463)
modelrfg1_score=rf_grid1.score(X_over,y_over)
print('Accuracy Score of Training Data: ', modelrfg1_score)
Accuracy Score of Training Data: 0.9980443285528031
y_predictrfg1= rf_grid1.predict(X_test)
modelrfg1_score = accuracy_score(Y_test, y_predictrfg1)
print('Accuracy Score of Test Data:', modelrfg1_score)
Accuracy Score of Test Data: 0.9341825902335457
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_predictrfg1, labels=[1, 0]))
Classification Report
precision recall f1-score support
1 0.00 0.00 0.00 31
0 0.93 1.00 0.97 440
accuracy 0.93 471
macro avg 0.47 0.50 0.48 471
weighted avg 0.87 0.93 0.90 471
#Plotting ROC and AUC
probs = rf_grid1.predict_proba(X_test)
preds = probs[:,1]
fpr, tpr, threshold = metrics.roc_curve(Y_test, preds)
roc_auc_rfo = metrics.auc(fpr, tpr)
plt.plot(fpr, tpr, label='RF over sampled (AUC = %0.2f)' % roc_auc_rfo)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([-0.02, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.show()
i = np.arange(len(tpr)) # index for df
roc = pd.DataFrame({'fpr': pd.Series(fpr, index=i),
                    'tpr': pd.Series(tpr, index=i),
                    '1-fpr': pd.Series(1 - fpr, index=i),
                    'tf': pd.Series(tpr - (1 - fpr), index=i),
                    'threshold': pd.Series(threshold, index=i)})
print(roc.loc[(roc.tf-0).abs().argsort()[:1]])
         fpr       tpr     1-fpr       tf  threshold
33  0.388636  0.612903  0.611364  0.00154   0.165661
# store the predicted probabilities for failed class
y_pred_prob = rf_grid1.predict_proba(X_test)[:, 1]
# predict fail if the predicted probability is greater than 0.1656
from sklearn.preprocessing import binarize
y_pred_class = binarize([y_pred_prob], threshold=0.1656)[0]
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_pred_class, labels=[1, 0]))
Classification Report
precision recall f1-score support
1 0.10 0.61 0.17 31
0 0.96 0.61 0.75 440
accuracy 0.61 471
macro avg 0.53 0.61 0.46 471
weighted avg 0.90 0.61 0.71 471
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_pred_class)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for RF over sampled (tuned threshold)', fontsize = 15);
precision_rfo, recall_rfo, f1_score_rfo, support = precision_recall_fscore_support(Y_test, y_pred_class, average = 'macro')
print('Precision Score :', '%0.2f' % precision_rfo)
print('Recall Score :', '%0.2f' % recall_rfo)
print('F1-Score:', '%0.2f' % f1_score_rfo)
rfo_acc= accuracy_score(Y_test, y_pred_class)
print('Accuracy Score :','%0.2f' % rfo_acc)
print('AUC :','%0.2f' % roc_auc_rfo)
print('Threshold :','%0.2f' % 0.1656)
Thresholdrf=0.1656
Precision Score : 0.53
Recall Score : 0.61
F1-Score: 0.46
Accuracy Score : 0.61
AUC : 0.64
Threshold : 0.17
xgb_para = {"learning_rate" : [0.05, 0.10, 0.15, 0.20, 0.25, 0.30 ] ,
"max_depth" : [ 3, 4, 5, 6, 8, 10, 12, 15],
"min_child_weight" : [ 1, 3, 5, 7 ],
"gamma" : [ 0.0, 0.1, 0.2 , 0.3, 0.4 ],
"colsample_bytree" : [ 0.3, 0.4, 0.5 , 0.7 ]
}
xgb = XGBClassifier()
xgb_hy = RandomizedSearchCV(estimator = xgb, param_distributions = xgb_para, cv = 5, verbose=2, random_state=25, n_jobs = -1)
xgb_hy.fit(X_adasyn, y_adasyn)
xgb_hy.best_params_
Fitting 5 folds for each of 10 candidates, totalling 50 fits
{'min_child_weight': 5,
'max_depth': 15,
'learning_rate': 0.2,
'gamma': 0.1,
'colsample_bytree': 0.3}
xgb=XGBClassifier(min_child_weight=5,
max_depth=15,
learning_rate= 0.2,
gamma= 0.1,
colsample_bytree=0.3)
xgb.fit(X_adasyn,y_adasyn)
XGBClassifier(base_score=0.5, booster='gbtree', colsample_bylevel=1,
colsample_bynode=1, colsample_bytree=0.3, gamma=0.1, gpu_id=-1,
importance_type='gain', interaction_constraints='',
learning_rate=0.2, max_delta_step=0, max_depth=15,
min_child_weight=5, missing=nan, monotone_constraints='()',
n_estimators=100, n_jobs=8, num_parallel_tree=1, random_state=0,
reg_alpha=0, reg_lambda=1, scale_pos_weight=1, subsample=1,
tree_method='exact', validate_parameters=1, verbosity=None)
modelxgb_score=xgb.score(X_adasyn,y_adasyn)
print('Accuracy Score of Training Data: ', modelxgb_score)
Accuracy Score of Training Data: 1.0
y_predictxg= xgb.predict(X_test)
modelxg_score = accuracy_score(Y_test, y_predictxg)
print('Accuracy Score of Test Data:', modelxg_score)
Accuracy Score of Test Data: 0.9299363057324841
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_predictxg, labels=[1, 0]))
Classification Report
precision recall f1-score support
1 0.00 0.00 0.00 31
0 0.93 1.00 0.96 440
accuracy 0.93 471
macro avg 0.47 0.50 0.48 471
weighted avg 0.87 0.93 0.90 471
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_predictxg)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for XGB ADASYN', fontsize = 15);
#Plotting ROC and AUC
probs = xgb.predict_proba(X_test)
preds = probs[:,1]
fpr, tpr, threshold = metrics.roc_curve(Y_test, preds)
roc_auc_xg = metrics.auc(fpr, tpr)
plt.plot(fpr, tpr, label='XGB ADASYN sampled (AUC = %0.2f)' % roc_auc_xg)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([-0.02, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.show()
i = np.arange(len(tpr)) # index for df
roc = pd.DataFrame({'fpr': pd.Series(fpr, index=i),
                    'tpr': pd.Series(tpr, index=i),
                    '1-fpr': pd.Series(1 - fpr, index=i),
                    'tf': pd.Series(tpr - (1 - fpr), index=i),
                    'threshold': pd.Series(threshold, index=i)})
print(roc.loc[(roc.tf-0).abs().argsort()[:1]])
         fpr       tpr     1-fpr        tf  threshold
29  0.490909  0.516129  0.509091  0.007038   0.019092
# store the predicted probabilities for failed class
y_pred_prob = xgb.predict_proba(X_test)[:, 1]
# predict fail if the predicted probability is greater than 0.0190
from sklearn.preprocessing import binarize
y_pred_class = binarize([y_pred_prob], threshold=0.0190)[0]
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_pred_class, labels=[1, 0]))
Classification Report
precision recall f1-score support
1 0.07 0.52 0.12 31
0 0.94 0.51 0.66 440
accuracy 0.51 471
macro avg 0.50 0.51 0.39 471
weighted avg 0.88 0.51 0.62 471
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_pred_class)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for XGB ADASYN (tuned threshold)', fontsize = 15);
precision_xg, recall_xg, f1_score_xg, support = precision_recall_fscore_support(Y_test, y_pred_class, average = 'macro')
print('Precision Score :', '%0.2f' % precision_xg)
print('Recall Score :', '%0.2f' % recall_xg)
print('F1-Score:', '%0.2f' % f1_score_xg)
xg_acc= accuracy_score(Y_test, y_pred_class)
print('Accuracy Score :','%0.2f' % xg_acc)
print('AUC :','%0.2f' % roc_auc_xg)
print('Threshold :','%0.2f' % 0.0190)
Thresholdxg=0.0190
Precision Score : 0.50
Recall Score : 0.51
F1-Score: 0.39
Accuracy Score : 0.51
AUC : 0.57
Threshold : 0.02
modellists = []
modellists.append(['Logistic Normal Data', lg_acc * 100, recall_lg * 100, precision_lg * 100,roc_auc_lg*100,f1_score_lg*100,Thresholdlo])
modellists.append(['SVM Under sampled data', sv_acc* 100, recall_sv * 100, precision_sv* 100,roc_auc_sv*100,f1_score_sv*100,Thresholdsv])
modellists.append(['LGBM Smote sampled Data', lgm_acc * 100, recall_lgm * 100, precision_lgm * 100,roc_auc_lgm*100,f1_score_lgm*100,Thresholdlg])
modellists.append(['Random Forest Over sampled Data', rfo_acc * 100, recall_rfo * 100, precision_rfo * 100,roc_auc_rfo*100,f1_score_rfo*100,Thresholdrf])
modellists.append(['XGboost ADASYN sampled Data', xg_acc * 100, recall_xg * 100, precision_xg * 100,roc_auc_xg*100,f1_score_xg*100,Thresholdxg])
model_df = pd.DataFrame(modellists, columns = ['Model', 'Accuracy Scores on Test', 'Recall Score', 'Precision Score','AUC','F1 Score','Threshold'])
model_df
| | Model | Accuracy Scores on Test | Recall Score | Precision Score | AUC | F1 Score | Threshold |
|---|---|---|---|---|---|---|---|
| 0 | Logistic Normal Data | 61.571125 | 61.440616 | 52.933050 | 65.549853 | 46.158510 | 0.0376 |
| 1 | SVM Under sampled data | 63.694268 | 64.076246 | 53.665521 | 70.916422 | 47.782367 | 0.3555 |
| 2 | LGBM Smote sampled Data | 54.352442 | 54.578446 | 51.132284 | 65.549853 | 46.158510 | 0.0441 |
| 3 | Random Forest Over sampled Data | 61.146497 | 61.213343 | 52.864769 | 63.548387 | 45.906578 | 0.1656 |
| 4 | XGboost ADASYN sampled Data | 50.955414 | 51.260997 | 50.310201 | 57.184751 | 39.073341 | 0.0190 |
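Given the heavy class imbalance (only 31 fails among the 471 test rows), accuracy is a weak ranking criterion; recall on the fail class matters most, since a missed fail is the costly error. A sketch of ranking a comparison table that way (model names and numbers hypothetical):

```python
import pandas as pd

# Hypothetical comparison rows: rank candidate models by recall, not accuracy.
rows = [['Model A', 61.6, 61.4],
        ['Model B', 63.7, 64.1],
        ['Model C', 54.4, 54.6]]
df = pd.DataFrame(rows, columns=['Model', 'Accuracy', 'Recall'])
best = df.sort_values('Recall', ascending=False).iloc[0]['Model']
print(best)  # Model B
```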
#making copies of the validation dataset
val_pca = pd.DataFrame(val_pca)
val1 = val_pca.copy().reset_index(drop=True)
val2 = val_pca.copy().reset_index(drop=True)
svc_cv.fit(X_under,y_under)
pred=svc_cv.predict(val1)
val1['Pass/Fail'] = pred
val1 = val1[(val1['Pass/Fail'] == 1)]
val1.head(18)
| 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 | 29 | 30 | 31 | 32 | 33 | 34 | 35 | 36 | 37 | 38 | 39 | 40 | 41 | 42 | 43 | 44 | 45 | 46 | 47 | 48 | 49 | 50 | 51 | 52 | 53 | 54 | 55 | 56 | 57 | 58 | 59 | 60 | 61 | 62 | 63 | 64 | 65 | 66 | 67 | 68 | 69 | 70 | 71 | 72 | 73 | 74 | 75 | 76 | 77 | 78 | 79 | 80 | 81 | 82 | 83 | 84 | 85 | 86 | 87 | 88 | 89 | 90 | 91 | 92 | 93 | 94 | 95 | 96 | 97 | 98 | 99 | 100 | 101 | 102 | 103 | 104 | 105 | 106 | 107 | 108 | 109 | 110 | 111 | 112 | 113 | 114 | 115 | 116 | 117 | 118 | 119 | 120 | 121 | 122 | 123 | 124 | 125 | 126 | 127 | 128 | 129 | 130 | 131 | 132 | 133 | 134 | 135 | 136 | 137 | 138 | 139 | 140 | 141 | 142 | 143 | 144 | 145 | 146 | 147 | 148 | 149 | 150 | 151 | 152 | 153 | 154 | 155 | 156 | 157 | Pass/Fail | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 2 | 0.278925 | 1.937571 | -0.139066 | -0.186996 | 0.000854 | 0.101144 | -0.758327 | -1.667386 | -2.252920 | 2.142752 | 2.455236 | 0.437368 | -2.035598 | -0.054022 | 1.944307 | -0.535901 | -0.304464 | 2.705936 | 0.239632 | -1.213717 | 0.620981 | -0.030675 | 2.557970 | -1.608922 | 0.505562 | 3.201052 | -0.882069 | -4.635498 | -0.770801 | -3.574428 | 1.330702 | 3.902943 | 2.544027 | -0.020070 | -3.531717 | -3.183553 | -0.405590 | 1.625177 | 2.049291 | -2.147744 | -3.378837 | -1.741670 | -3.401969 | -1.744007 | -0.646464 | -4.285466 | -0.772173 | 3.050395 | -4.472710 | 1.193505 | -1.530343 | 0.710492 | 0.169889 | -0.122648 | 1.196557 | -1.207668 | -1.335390 | -1.998037 | 0.611191 | -0.713760 | -1.868101 | 1.471241 | 2.605144 | -0.653198 | 1.621636 | 0.371693 | 1.070052 | 2.163889 | 0.323717 | 0.935321 | -0.419916 | -0.187361 | -0.085901 | 0.242981 | 1.466339 | 0.891498 | 0.798790 | 0.207819 | 0.591670 | -0.263704 | -2.194396 | 1.266578 | -1.274232 | -2.931262 | -0.629852 | 0.211246 | 0.946987 | -0.662682 | 0.538487 | 1.900983 | -1.101718 | 1.329419 | 0.776497 | 1.490126 | -2.773633 | 0.639470 | -1.424861 | -2.999525 | 1.983159 | 1.502644 | 1.079523 | 1.235522 | 0.029121 | -1.552666 | 0.301060 | -0.939610 | -2.913030 | 2.102846 | 1.776076 | 0.840810 | 0.068449 | -0.249979 | 1.426895 | -0.018327 | -1.431741 | -4.153746 | 1.564734 | -0.285376 | 0.415364 | -1.171505 | -0.314343 | 1.641471 | -1.496775 | -0.397810 | 1.671291 | -0.113252 | 1.322176 | -0.043678 | -1.172433 | 0.357317 | -0.502624 | -0.644933 | -0.775789 | 0.750153 | -0.516105 | 0.846653 | -0.515951 | 0.021140 | -0.401219 | -1.616632 | 0.499506 | 0.313672 | -0.282122 | -0.159237 | 0.087614 | 0.054005 | -1.335955 | -1.321001 | 0.542853 | -0.305492 | 0.320494 | 0.680499 | -1.541633 | 0.297344 | 0.379577 | 0.463219 | -0.983617 | -0.388313 | 1 |
| 4 | 0.308119 | 4.417323 | 1.840441 | 0.612236 | 0.162877 | 0.119498 | 0.332537 | -0.272154 | 2.250630 | 3.170334 | 0.735757 | -3.247212 | 2.283609 | 1.366154 | -4.941261 | -1.902437 | 0.676186 | -1.913984 | 0.035088 | -4.566823 | -0.578037 | -2.066076 | 0.449146 | -4.381404 | 0.436853 | -4.109056 | 1.588612 | 3.023159 | 0.173118 | -4.866134 | -7.414828 | 1.337723 | -2.863879 | 0.229513 | 2.787015 | 3.748061 | 4.825462 | -4.503855 | 2.089468 | -3.219903 | 2.476945 | -6.941446 | -2.260231 | 3.426524 | -4.170843 | -0.141279 | -0.641335 | -0.327460 | -1.765326 | -2.993638 | -2.476690 | -10.101094 | 6.858996 | 0.445752 | 2.051620 | 0.507909 | -2.420322 | 2.755170 | -1.849488 | 13.599848 | 5.273400 | -2.512991 | 0.035695 | 3.768467 | 8.222702 | -5.794324 | -3.187681 | 3.792847 | 6.092398 | 8.778741 | 11.070746 | 1.825563 | -5.218055 | -2.209795 | -6.929549 | 10.212297 | 9.075266 | -5.313948 | 8.404048 | 6.046228 | -2.537100 | 0.770576 | 5.922444 | -1.991262 | 0.465978 | 1.928767 | -2.599812 | -3.144039 | -2.765545 | 1.326886 | -1.005208 | 1.282151 | -1.728543 | -0.974224 | -0.324821 | 0.286365 | -2.987764 | 2.571216 | -0.641733 | -2.284347 | -0.758108 | 4.137175 | -0.308630 | 3.971004 | -0.682584 | 2.246494 | 0.805599 | -1.639389 | -0.727747 | 1.209131 | -1.099584 | -1.161394 | -0.430024 | 0.640492 | 1.555746 | -0.671016 | -0.242333 | -2.873451 | 0.895486 | 0.451472 | 2.481022 | 1.253874 | 0.274337 | -0.417389 | -1.407498 | 0.218916 | -0.913272 | 0.367293 | -0.735490 | 0.360974 | -0.152363 | -0.637721 | -0.130674 | 0.093623 | 0.761311 | -0.511164 | -1.005618 | 0.254734 | 1.011350 | 1.402527 | -0.444417 | 0.437472 | 0.732216 | 0.308114 | -0.070026 | 0.478572 | -0.986544 | -0.796664 | 1.640639 | 1.167493 | 0.632968 | -0.633622 | -0.639812 | -0.526713 | -0.825353 | -0.191233 | -0.424047 | 0.050306 | 1 |
| 5 | 1.866189 | 3.413541 | 1.280414 | 0.024596 | 0.847482 | 0.911638 | 0.225781 | -0.192407 | -2.205701 | 2.990652 | -1.594787 | -1.807544 | 0.502865 | -0.602549 | 0.167212 | 0.320897 | -0.747420 | 0.404667 | 4.014737 | -1.004617 | -1.600045 | 2.139204 | -0.058319 | 0.121112 | -1.332945 | 2.908303 | -0.407453 | -0.649338 | 3.115151 | -3.150124 | -0.420361 | 2.155023 | 1.917009 | -3.709007 | -1.768021 | 3.180375 | 0.698044 | -1.306607 | -0.851179 | 2.952410 | -1.157035 | 1.549897 | 1.461619 | 1.524106 | -0.826336 | 1.727414 | -2.450334 | -0.643661 | 2.188123 | -0.957218 | -1.597890 | 2.461863 | -2.976806 | -2.454712 | 1.620904 | 0.854944 | -3.640088 | 1.098793 | -0.552830 | 1.336554 | 0.523440 | -1.397152 | 0.142742 | -0.547898 | 1.320479 | -1.933428 | 0.980043 | -0.784491 | 0.176003 | 2.815363 | -0.214836 | 0.704518 | 0.269887 | -0.586680 | -0.186849 | 0.179942 | -1.634708 | -0.359810 | -2.063449 | -1.351895 | -2.015269 | -0.109918 | -0.256827 | 2.321290 | 0.489488 | -0.151504 | 0.298924 | -0.012004 | 0.648848 | 1.837169 | -1.600232 | 1.653920 | 0.409613 | -0.983629 | 1.495169 | -0.274762 | -0.031435 | 1.105462 | -0.718269 | -0.412102 | 0.540110 | 2.559888 | -0.829652 | -0.707493 | 0.104479 | -0.419933 | -1.017436 | -0.028433 | -0.064425 | 0.903526 | -1.757463 | 0.004265 | -0.583221 | -0.270954 | -1.439817 | -1.999746 | 1.360711 | 0.792390 | -0.140518 | 0.051427 | -0.777022 | 0.221054 | -0.818494 | -1.272190 | -0.570873 | 0.667771 | -0.128567 | 0.293823 | 0.739555 | 1.517655 | 0.699982 | 0.374786 | 0.740316 | 0.645759 | -0.662840 | -0.980265 | 0.128214 | 1.134974 | -0.136511 | 0.585354 | -1.798005 | -0.647219 | -0.781005 | 1.214993 | 1.431574 | 0.048713 | -1.193782 | 1.362363 | -0.415005 | -0.292285 | -0.674801 | 0.179991 | -0.709261 | -1.810861 | 0.511374 | 0.791150 | 0.696139 | -0.002840 | 1 |
| 14 | -2.374126 | 22.693246 | -22.255507 | 5.319930 | -4.994762 | 4.995794 | -0.163109 | 0.152872 | -4.272536 | -4.377114 | -2.223973 | -0.012304 | 1.094292 | 0.334486 | -4.448161 | -2.564457 | -1.287455 | -1.127504 | 2.903620 | 1.479645 | 0.288699 | 3.548992 | -2.552571 | 2.214341 | 1.834440 | 0.961153 | -0.380428 | 2.743778 | -0.402085 | 2.854520 | 1.625674 | -0.660409 | 3.713644 | -2.607831 | -1.006163 | 2.044437 | 0.715431 | -0.660141 | -0.913560 | -2.475820 | 0.404336 | -0.447138 | -4.346282 | -0.619852 | 1.686708 | -3.914504 | 4.049237 | 1.568886 | 1.217172 | 4.567500 | -4.212582 | -1.454013 | 4.940804 | -0.653738 | 6.554161 | 1.056602 | 0.901673 | 4.813135 | 2.669863 | -0.214445 | -1.481135 | -5.861116 | 0.570929 | -0.504327 | 1.763310 | 0.644947 | 0.721212 | -0.726589 | -0.622338 | -1.595334 | -1.067218 | 0.933752 | -3.239903 | -1.177951 | -2.257086 | -1.035539 | -5.759324 | -1.645341 | 1.776213 | -2.165516 | 1.545621 | -0.674933 | -0.108013 | 1.227573 | -1.286568 | -1.118812 | -1.475431 | 0.637640 | 1.552012 | 0.091567 | -0.122894 | 0.202455 | 0.420961 | 0.318991 | 0.975163 | -0.643336 | -2.718410 | 1.085262 | 0.983388 | -0.163329 | 0.279686 | -1.605667 | -0.882556 | -0.023623 | 2.870387 | -0.874302 | -0.326311 | 2.454123 | 1.440013 | 0.002438 | 1.344371 | -1.664960 | -0.185915 | 0.291363 | -1.635758 | 2.327299 | 0.557720 | 0.063808 | 1.060982 | -1.824013 | -0.015150 | 1.997989 | -1.919503 | -0.580002 | 0.596915 | -1.354827 | -0.507173 | -0.799870 | 0.253398 | -1.265421 | 0.511679 | -2.339331 | 0.983946 | 0.955528 | 0.515428 | -0.993575 | 0.938440 | -0.796613 | 0.736832 | 1.525735 | -1.573658 | 1.317566 | -1.759656 | -2.086890 | -0.710828 | 0.955384 | 2.503658 | 1.367572 | 3.855194 | -1.662581 | 3.479641 | 1.380408 | -1.289141 | -1.354862 | -1.135402 | 0.249233 | -0.616439 | 0.062429 | 1 |
| 15 | -2.298516 | 19.865789 | -17.993050 | 3.776964 | -4.093476 | 2.331331 | 0.439079 | -0.053698 | -3.233229 | -5.507317 | -1.325462 | -0.735285 | 2.420270 | -0.003283 | -3.827074 | -2.676825 | -1.959222 | 0.414136 | 2.507113 | -0.941510 | -0.865092 | 5.156960 | -3.463763 | 1.798822 | 0.210368 | -2.300466 | 0.045526 | 3.830971 | 0.209051 | 1.969175 | 2.368265 | -0.247671 | 3.756424 | -3.169911 | -0.357184 | 3.251802 | 1.507800 | -3.192979 | 0.767076 | -2.187417 | 0.678654 | 0.602860 | -6.132421 | -1.047581 | 1.633582 | -2.715821 | 5.409552 | 1.283143 | -0.132249 | 6.095938 | -4.138871 | -1.214501 | 8.086937 | -1.901384 | 6.798099 | -0.071173 | 3.681828 | 6.485591 | 2.452330 | -1.172071 | -3.144859 | -5.901689 | -1.116416 | -0.407399 | 2.379613 | -2.125138 | 2.559780 | 0.339292 | -2.182469 | -1.399273 | 0.341084 | 4.705881 | -3.915652 | -1.757349 | -1.385057 | -0.752720 | -6.283994 | -1.266796 | 2.238572 | -2.548664 | 1.837513 | 0.211306 | -1.736292 | -0.117958 | 0.936054 | -0.542998 | -2.436415 | 1.359733 | -0.721729 | 1.058691 | -2.261174 | 0.353583 | 0.821764 | 0.236032 | -1.187876 | -1.704876 | -1.054848 | 1.709939 | 1.377795 | 1.896296 | 1.606122 | -1.011592 | -0.571097 | -1.714601 | 1.161605 | -0.732492 | -1.166245 | 1.707696 | 0.833719 | 0.760004 | 0.200395 | -1.570417 | -1.087655 | 1.649829 | -1.679480 | 1.691316 | -0.134593 | -1.207867 | 2.884035 | -0.027486 | -0.468792 | 1.294946 | -1.791823 | -0.720031 | -0.762934 | -1.425091 | 0.674799 | -0.138870 | 0.271059 | -1.219723 | 0.555672 | -0.080559 | 1.612882 | 0.697683 | -0.644446 | 0.915810 | -0.179973 | -1.390245 | 0.558863 | -0.113259 | -1.199698 | 0.660199 | 1.213656 | -1.461542 | 0.182516 | 1.254318 | 1.590712 | 2.536058 | 0.229059 | -1.465241 | 0.712629 | 2.813534 | -0.217564 | -1.726012 | -1.379303 | 1.357969 | -0.492668 | 0.044341 | 1 |
#applying the tuned threshold to the SVM probabilities
# store the predicted probabilities for the failed class
y_pred_prob = svc_cv.predict_proba(val2)[:, 1]
# predict fail if the predicted probability is greater than 0.3555
pred = binarize([y_pred_prob], threshold=0.3555)[0]
val2['Pass/Fail'] = pred
val2 = val2[(val2['Pass/Fail'] == 1)]
val2.head(19)
| 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 | 29 | 30 | 31 | 32 | 33 | 34 | 35 | 36 | 37 | 38 | 39 | 40 | 41 | 42 | 43 | 44 | 45 | 46 | 47 | 48 | 49 | 50 | 51 | 52 | 53 | 54 | 55 | 56 | 57 | 58 | 59 | 60 | 61 | 62 | 63 | 64 | 65 | 66 | 67 | 68 | 69 | 70 | 71 | 72 | 73 | 74 | 75 | 76 | 77 | 78 | 79 | 80 | 81 | 82 | 83 | 84 | 85 | 86 | 87 | 88 | 89 | 90 | 91 | 92 | 93 | 94 | 95 | 96 | 97 | 98 | 99 | 100 | 101 | 102 | 103 | 104 | 105 | 106 | 107 | 108 | 109 | 110 | 111 | 112 | 113 | 114 | 115 | 116 | 117 | 118 | 119 | 120 | 121 | 122 | 123 | 124 | 125 | 126 | 127 | 128 | 129 | 130 | 131 | 132 | 133 | 134 | 135 | 136 | 137 | 138 | 139 | 140 | 141 | 142 | 143 | 144 | 145 | 146 | 147 | 148 | 149 | 150 | 151 | 152 | 153 | 154 | 155 | 156 | 157 | Pass/Fail | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 2 | 0.278925 | 1.937571 | -0.139066 | -0.186996 | 0.000854 | 0.101144 | -0.758327 | -1.667386 | -2.252920 | 2.142752 | 2.455236 | 0.437368 | -2.035598 | -0.054022 | 1.944307 | -0.535901 | -0.304464 | 2.705936 | 0.239632 | -1.213717 | 0.620981 | -0.030675 | 2.557970 | -1.608922 | 0.505562 | 3.201052 | -0.882069 | -4.635498 | -0.770801 | -3.574428 | 1.330702 | 3.902943 | 2.544027 | -0.020070 | -3.531717 | -3.183553 | -0.405590 | 1.625177 | 2.049291 | -2.147744 | -3.378837 | -1.741670 | -3.401969 | -1.744007 | -0.646464 | -4.285466 | -0.772173 | 3.050395 | -4.472710 | 1.193505 | -1.530343 | 0.710492 | 0.169889 | -0.122648 | 1.196557 | -1.207668 | -1.335390 | -1.998037 | 0.611191 | -0.713760 | -1.868101 | 1.471241 | 2.605144 | -0.653198 | 1.621636 | 0.371693 | 1.070052 | 2.163889 | 0.323717 | 0.935321 | -0.419916 | -0.187361 | -0.085901 | 0.242981 | 1.466339 | 0.891498 | 0.798790 | 0.207819 | 0.591670 | -0.263704 | -2.194396 | 1.266578 | -1.274232 | -2.931262 | -0.629852 | 0.211246 | 0.946987 | -0.662682 | 0.538487 | 1.900983 | -1.101718 | 1.329419 | 0.776497 | 1.490126 | -2.773633 | 0.639470 | -1.424861 | -2.999525 | 1.983159 | 1.502644 | 1.079523 | 1.235522 | 0.029121 | -1.552666 | 0.301060 | -0.939610 | -2.913030 | 2.102846 | 1.776076 | 0.840810 | 0.068449 | -0.249979 | 1.426895 | -0.018327 | -1.431741 | -4.153746 | 1.564734 | -0.285376 | 0.415364 | -1.171505 | -0.314343 | 1.641471 | -1.496775 | -0.397810 | 1.671291 | -0.113252 | 1.322176 | -0.043678 | -1.172433 | 0.357317 | -0.502624 | -0.644933 | -0.775789 | 0.750153 | -0.516105 | 0.846653 | -0.515951 | 0.021140 | -0.401219 | -1.616632 | 0.499506 | 0.313672 | -0.282122 | -0.159237 | 0.087614 | 0.054005 | -1.335955 | -1.321001 | 0.542853 | -0.305492 | 0.320494 | 0.680499 | -1.541633 | 0.297344 | 0.379577 | 0.463219 | -0.983617 | -0.388313 | 1.0 |
| 3 | 0.528330 | 2.175852 | -1.693965 | 2.652302 | -0.684325 | -0.241593 | -5.150900 | -2.631122 | -0.789310 | 19.651834 | 6.584746 | -16.192337 | 2.054759 | 4.486971 | -3.775697 | -2.448522 | 8.783194 | -3.880678 | 0.783646 | -3.280692 | -2.298974 | 3.412294 | -0.903023 | -3.117263 | -2.773727 | -1.780158 | -3.333509 | -4.022592 | -1.704069 | -2.383513 | 7.680935 | 3.381884 | 5.684561 | -1.292853 | 1.264532 | -3.360697 | 1.710045 | -0.020923 | 4.460400 | -2.690661 | -2.820947 | -4.580525 | -3.913600 | 2.091732 | -0.830464 | 2.596260 | -0.334023 | 2.606263 | 2.086205 | 3.209863 | 0.148541 | 6.099810 | 6.681021 | 0.368808 | -2.113174 | 0.022069 | 2.088355 | -6.347612 | -2.159596 | 0.143957 | -7.622815 | 6.896746 | 4.403907 | 7.730554 | -7.672943 | 2.169896 | 1.905398 | 3.434648 | -0.330636 | -0.113292 | -4.492562 | 11.265915 | 3.663364 | 5.387365 | -7.471309 | -2.904263 | 6.923769 | -2.543592 | -6.120720 | -3.309150 | 1.210757 | -6.756510 | 7.148960 | -1.922062 | -4.858350 | 3.975204 | -6.068748 | 1.300378 | 5.254519 | 4.538226 | 6.043878 | -2.433198 | 1.298519 | -0.626323 | -1.529590 | 3.689458 | -1.145907 | 6.676622 | 1.218922 | -2.460351 | 4.650600 | 1.157004 | -6.375441 | 1.513282 | 5.271153 | 1.621288 | 0.681633 | -3.320575 | 4.437937 | -1.564535 | 0.492631 | -0.673340 | 0.556452 | -1.405336 | 3.176196 | 2.889401 | -1.128972 | -1.223472 | -3.604625 | 1.776693 | -0.635190 | -0.197444 | -2.719597 | 0.399110 | -1.806255 | 2.782734 | 0.849430 | 0.913012 | -0.980631 | 0.616298 | -1.662987 | 2.789860 | 2.513476 | -0.468072 | -0.405473 | -1.093192 | 2.030645 | -1.103873 | 0.848213 | -0.221899 | -0.864103 | 1.185704 | 1.124685 | 0.812065 | 1.361929 | -0.194173 | 0.276197 | 1.146809 | 0.207608 | 1.374964 | -0.104988 | 0.287760 | -0.351112 | -0.514601 | 0.171187 | 0.720761 | -0.274476 | 0.216156 | 1.0 |
(Notebook output, truncated: sample rows of the transformed feature matrix — each row holds an index and roughly 160 standardised feature values, ending with the target label 1.0, i.e. a failing entity.)
1) SVM with random under sampling gives a sensitivity of 29% with a type 2 error rate of 71%, while predicting 5 observations to have failed. Lowering the classification threshold to 0.3555 raises the sensitivity to 65%, cutting the type 2 error rate roughly in half at the cost of a 27% increase in the type 1 error rate, while predicting 11 observations to have failed.
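The threshold adjustment above can be sketched as follows. This is a minimal illustration on synthetic imbalanced data, not the notebook's exact pipeline, and the 0.3 cut-off is an assumption standing in for the tuned 0.3555 value: lowering the cut-off flags more entities as fails, which can only hold or raise sensitivity.

```python
# Sketch: trading type 2 error (missed fails) for type 1 error (false alarms)
# by lowering the probability threshold below the default 0.5.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split

# synthetic stand-in for the SECOM features: ~10% failing entities
X, y = make_classification(n_samples=600, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
proba_fail = clf.predict_proba(X_te)[:, 1]        # probability of the fail class

default_pred = (proba_fail >= 0.5).astype(int)    # default cut-off
lowered_pred = (proba_fail >= 0.3).astype(int)    # lowered cut-off (assumed value)

# a lower threshold marks a superset of entities as fails, so recall on
# the fail class (sensitivity) is non-decreasing
assert recall_score(y_te, lowered_pred) >= recall_score(y_te, default_pred)
```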
#Stratified cross-validation to check how well Random Forest with random over sampling would perform on unseen data
stratified_kfold = StratifiedKFold(n_splits = 10, random_state = 25, shuffle = True)
results = cross_val_score(rf_grid1, X_over, y_over, cv = stratified_kfold)
print('Accuracy Score')
print('Average: ', results.mean())
print('Standard deviation: ', results.std())
Accuracy Score
Average:  0.9850097614803497
Standard deviation:  0.008771313477207204
#Stratified cross-validation to check how well SVM with PCA and random under sampling would perform on unseen data
stratified_kfold = StratifiedKFold(n_splits = 10, random_state = 55, shuffle = True)
results = cross_val_score(svc_cv, X_under, y_under, cv = stratified_kfold)
print('Accuracy Score')
print('Average: ', results.mean())
print('Standard deviation: ', results.std())
Accuracy Score
Average:  0.699134199134199
Standard deviation:  0.10702801138594042
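The random over sampling that produced `X_over` and `y_over` can be sketched with scikit-learn's `resample`; the original notebook may have used a dedicated resampler (such as imblearn's `RandomOverSampler`), so treat this as an equivalent illustration on toy data rather than the exact code.

```python
# Sketch: random over sampling — duplicate minority-class (fail) rows with
# replacement until both classes are the same size.
import numpy as np
from sklearn.utils import resample

rng = np.random.RandomState(25)
X = rng.randn(100, 5)                         # toy feature matrix
y = np.array([1] * 10 + [-1] * 90)            # 10 fails vs 90 passes

X_min, X_maj = X[y == 1], X[y == -1]
X_min_up = resample(X_min, replace=True, n_samples=len(X_maj), random_state=25)

X_over = np.vstack([X_maj, X_min_up])
y_over = np.array([-1] * len(X_maj) + [1] * len(X_min_up))
assert (y_over == 1).sum() == (y_over == -1).sum()   # classes now balanced
```

One caveat worth noting: because over sampling duplicates rows, cross-validating after resampling (as above) can leak copies of the same fail between train and test folds, which helps explain the very high 98.5% average accuracy.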
Based on the overall analysis and model performances, we can narrow down the observations that are flagged as fails across all the validation sets and infer that these entities are the most likely to fail. This should be confirmed with a domain specialist to establish an acceptable limit on the type 2 error rate; the best-performing model can then be chosen against that limit. As far as PCA is concerned, the models do better without it, and Random Forest with random over sampling gives the best overall results.
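The idea of collecting observations that are flagged across the validation sets can be sketched with `cross_val_predict`, which gives every row exactly one out-of-fold prediction; rows predicted as fails are the candidates to hand to a domain specialist. Synthetic data stands in for the SECOM features here, and the model choice mirrors the Random Forest conclusion above without reproducing the tuned `rf_grid1`.

```python
# Sketch: out-of-fold predictions over stratified folds; rows predicted as
# class 1 (fail) while held out are flagged for review.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_predict

X, y = make_classification(n_samples=400, weights=[0.85, 0.15], random_state=7)

skf = StratifiedKFold(n_splits=10, shuffle=True, random_state=25)
oof_pred = cross_val_predict(RandomForestClassifier(random_state=25), X, y, cv=skf)

likely_fail_idx = np.where(oof_pred == 1)[0]   # entities flagged out-of-fold
print(len(likely_fail_idx), 'entities flagged for review')
```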